diff --git a/README.md b/README.md index e9bba75..f8377c7 100644 --- a/README.md +++ b/README.md @@ -1,9 +1,67 @@ -# vision_landing +# Vision Landing 2 -_**WARNING: This project is currently discontinued and is of academic interest only. ArduPilot and PX4 autopilots do not yet safely support vision based precision landing. PLEASE DO NOT USE THIS CODE other than for experimental or learning purposes. IT WILL behave dangerously. You have been warned.**_ +_**WARNING: Use at your own risk.**_ ### Precision landing using visual targets. -This is a project to achieve precision landing on drones using ArduCopter firmware, using (monocular) vision alone. Fiducial markers are printed and used as landing targets, and these targets provide orientation, location and distance information when combined with accurate size information of the markers and calibrated camera information. No rangefinder is necessary, as the distance to target is obtained automatically through pose estimation of the markers. + +This is a project to achieve precision landing on drones, using (monocular) vision alone. Fiducial markers are printed and used as landing targets, and these targets provide orientation, location and distance information when combined with accurate size information of the markers and calibrated camera information. No rangefinder is necessary, as the distance to target is obtained automatically through pose estimation of the markers. + +This is an improved version of https://github.com/goodrobots/vision_landing with the following additional features: + +* Uses AprilTags. +* Allows defining a landing point relative to multiple markers (not limited to marker centers). This also solves the problem of bouncing between detected markers. +* Estimates the pose using the biggest visible marker, which offers a better pose estimation (more pixels to detect). Once it leaves the field of view, the next biggest detected marker is used. +* Supports a JSON configuration file (TODO: this should replace the old vision_landing.conf). +* Implements an alternative input source using TCP sockets to obtain raw frames with lower latency, lower CPU usage and better quality (used together with [RosettaDrone](https://github.com/RosettaDrone/rosettadrone)). The drone's yaw and an image timestamp are also sent together with the images. +* Many bug fixes and improvements. +* Merges the ideas and code of this alternative implementation. See: https://github.com/chobitsfan/apriltag_plnd/issues/1 +* Integrates the [SmartLanding](https://github.com/RosettaDrone/SmartLanding) algorithm framework and implements a flight controller to actually land the drone using the [SET_POSITION_TARGET_LOCAL_NED](https://mavlink.io/en/messages/common.html#SET_POSITION_TARGET_LOCAL_NED) and [MAV_CMD_NAV_LAND](https://mavlink.io/en/messages/common.html#MAV_CMD_NAV_LAND) MAVLink commands. Alternatively, you may also just send the [LANDING_TARGET](https://mavlink.io/en/messages/common.html#LANDING_TARGET) message and let the drone's flight controller perform the landing. + +For testing, you can use this camera simulator to generate and stream a scene with markers from the viewpoint of a drone controlled via MAVLink: +https://github.com/kripper/mavlink-camera-simulator/ + +Screenshot of 3 markers of different sizes. The biggest visible marker is used to estimate the pose of the relative landing point (red cross). + +![image](https://user-images.githubusercontent.com/1479804/226492709-68e153fe-f34d-4182-aac5-12cd4f482599.png)
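To make the geometry above concrete, here is a minimal, illustrative sketch (not part of the code base) of how a landing point is derived from a marker's pose and its configured offset, and how `--get-offsets` inverts that relation. It assumes `numpy`; the pose and offset values below are placeholders:

```python
import numpy as np

# Placeholder pose of a detected marker in camera coordinates.
R = np.eye(3)                       # rotation matrix from pose estimation
t = np.array([0.10, -0.05, 1.50])   # translation in metres

# Landing-point offset in the marker's own frame (offsetX, offsetY from config.json).
offset = np.array([0.026349, 0.141789, 0.0])

# Landing point in camera coordinates: LP = R * offset + t
landing_point = R.dot(offset) + t

# Inverse relation used when computing offsets: given the landing point estimated
# from the reference marker, recover another marker's offset from its own pose (R, t).
recovered_offset = np.linalg.inv(R).dot(landing_point - t)
```

This mirrors the `matd_multiply(pose.R, tgt_offset)` / `matd_add(op, pose.t)` computation in `src/apriltag.cpp` and the inverse relation used by the offset tool.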
+ +### Marker Offsets + +The best strategy is to set the landing point on the center of the smallest marker and have all other bigger markers reference this landing point using offsets. +This way the drone will be able to see at least the smallest marker until it touches the ground. + +The offsets (`offsetX` and `offsetY`) must be configured in the `config.json` file. + +To automatically compute the offsets, you can pass the `--get-offsets` argument to the `track_targets` binary. +The smallest marker should have the offset `(0,0)` to point to its center. + +![image](https://user-images.githubusercontent.com/1479804/228932515-d2f5df8b-ed29-492a-b984-4cb42f768e69.png) + +### Running track_targets directly + +Before running `vision_landing`, it is recommended to try running `track_targets` directly. + +This is the syntax: + +`./track_targets [option...] --width <width> --height <height> --fps <fps> -o <output> <input> <calibration>` + +Example command line: + +`./track_targets --get-offsets=85 --width 1280 --height 720 --fps 15 -o 'appsrc ! videoconvert ! videorate ! openh264enc bitrate=1000000 ! rtph264pay ! udpsink host=laptop port=5000' 'udpsrc port=5000 ! application/x-rtp, encoding-name=H264, payload=96 ! rtph264depay ! avdec_h264 ! capsfilter caps="video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, framerate=(fraction)25/1" ! videoconvert ! appsink' calibration/dji-mini-se-1280x720.yml` + +To receive the output stream with the augmented reality drawings, use: + +`gst-launch-1.0 -v udpsrc port=5000 ! application/x-rtp, encoding-name=H264 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink` + +### Video + +Here is a video of my first basic test landing a DJI Mini SE with Vision Landing 2 and [RosettaDrone](https://github.com/RosettaDrone/rosettadrone): + +[![First basic test](https://img.youtube.com/vi/tOXuLmG5JBc/0.jpg)](https://www.youtube.com/watch?v=tOXuLmG5JBc) + +--- + +The rest of this document is a copy of the original Vision Landing project, which only supported Aruco markers. +You can calibrate the camera using Aruco markers as explained below and then use AprilTag markers for the landing. Demonstrations -------------------- @@ -100,7 +158,7 @@ There are two main components: track_targets must be compiled and installed into the main directory before vision_landing can be run. vision_landing calls track_targets to do the actual target detection and vector calculations. ``` - git clone https://github.com/goodrobots/vision_landing + git clone https://github.com/RosettaDrone/vision-landing-2.git cd vision_landing/src cmake . && make && make install ``` diff --git a/calibration/dji-mini-se-1280x720.yml b/calibration/dji-mini-se-1280x720.yml new file mode 100644 index 0000000..6e19053 --- /dev/null +++ b/calibration/dji-mini-se-1280x720.yml @@ -0,0 +1,17 @@ +%YAML:1.0 +--- +image_width: 1280 +image_height: 720 +camera_matrix: !!opencv-matrix + rows: 3 + cols: 3 + dt: d + data: [ 1.0095704757390205e+03, 0., 6.2537194135269283e+02, 0., + 1.0114414327487788e+03, 3.7032175215787225e+02, 0., 0., 1.
] +distortion_coefficients: !!opencv-matrix + rows: 1 + cols: 5 + dt: d + data: [ 1.3335449381174928e-02, -1.5682765158136096e-02, + 3.9366517725796467e-04, -3.0342333029495221e-03, + -4.7797052205165239e-02 ] diff --git a/config.json b/config.json new file mode 100644 index 0000000..d8f0966 --- /dev/null +++ b/config.json @@ -0,0 +1,22 @@ +{ + "markers": [ + { + "id": 199, + "size": 0.135, + "offsetX": 0.026349, + "offsetY": 0.141789 + }, + { + "id": 118, + "size": 0.03, + "offsetX": 0.046997, + "offsetY": -0.013023 + }, + { + "id": 85, + "size": 0.014, + "offsetX": 0.0, + "offsetY": 0.0 + } + ] +} \ No newline at end of file diff --git a/src/CMakeLists.txt b/src/CMakeLists.txt index 01caed4..4c81bb8 100644 --- a/src/CMakeLists.txt +++ b/src/CMakeLists.txt @@ -1,23 +1,23 @@ -cmake_minimum_required (VERSION 2.8) -project (track_targets) -find_package(aruco REQUIRED ) -find_package(OpenCV REQUIRED) - -option(USE_TIMERS "Set to OFF to disable timers" OFF) -iF(USE_TIMERS) -add_definitions(-DUSE_TIMERS) -ENDIF() - -add_executable(track_targets track_targets.cpp) - -set(EXTRA_C_FLAGS_RELEASE "${EXTRA_C_FLAGS_RELEASE} -std=c++0x -pthread -march=armv8-a+crc -mfpu=neon-vfpv4 -mtune=cortex-a53 -ftree-vectorize -mfloat-abi=hard -O3 ") -set(cpp_compile_flags "-std=gnu++11 -pthread") -add_definitions(${cpp_compile_flags}) - -include_directories(${aruco_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS}) -target_link_libraries(track_targets aruco ${aruco_LIBS} ${OpenCV_LIBS} pthread) -link_directories(${aruco_LIB_DIR} ${OpenCV_INSTALL_PATH}/lib}) -link_libraries(pthread) -install(PROGRAMS track_targets DESTINATION ${CMAKE_CURRENT_SOURCE_DIR}/..) - -set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -latomic") +cmake_minimum_required (VERSION 2.8) +project (track_targets) +find_package(aruco REQUIRED ) +find_package(OpenCV REQUIRED) + +option(USE_TIMERS "Set to OFF to disable timers" OFF) +iF(USE_TIMERS) +add_definitions(-DUSE_TIMERS) +ENDIF() + +add_executable(track_targets track_targets.cpp apriltag.cpp raw-tcp-video.cpp) + +set(EXTRA_C_FLAGS_RELEASE "${EXTRA_C_FLAGS_RELEASE} -std=c++0x -pthread -march=armv8-a+crc -mfpu=neon-vfpv4 -mtune=cortex-a53 -ftree-vectorize -mfloat-abi=hard -O3 ") +set(cpp_compile_flags "-std=gnu++11 -pthread") +add_definitions(${cpp_compile_flags}) + +include_directories(${aruco_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS}) +target_link_libraries(track_targets aruco ${aruco_LIBS} ${OpenCV_LIBS} pthread) +link_directories(${aruco_LIB_DIR} ${OpenCV_INSTALL_PATH}/lib}) +link_libraries(pthread) +install(PROGRAMS track_targets DESTINATION ${CMAKE_CURRENT_SOURCE_DIR}/..) + +set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -latomic -lapriltag") diff --git a/src/apriltag.cpp b/src/apriltag.cpp new file mode 100644 index 0000000..fab6409 --- /dev/null +++ b/src/apriltag.cpp @@ -0,0 +1,305 @@ +#include +#include +#include +#include +#include + +#include + +#include "apriltag.h" + +#include +#include +#include +//#include + +const bool DEBUG = false; // Slower, but processes all markers and provides more info +int COMPUTE_OFFSETS = false; // Keep disabled. 
May crash with SIGSEV when unable to compute + +using std::cout; +using cv::Mat; +using cv::Point2f; +using cv::Scalar; + +apriltag_detector_t *td; +apriltag_family_t *tf; +matd_t* tgt_offset; + +// Camera callibration properties +apriltag_detection_info_t det_info; + +cv::Mat zeroRotation = cv::Mat::zeros(3, 1, CV_64FC1); // No rotation +cv::Mat zeroTranslation = cv::Mat::zeros(3, 1, CV_64FC1); // No translation + +using nlohmann::json; + +extern bool test; +extern json CONFIG; + +std::unordered_map markersById; + +void init_markers() { + // Index markers by id + for (const auto& marker : CONFIG["markers"]) { + markersById[marker["id"]] = marker; + } +} + +void apriltag_init(const aruco::CameraParameters &CP) { + td = apriltag_detector_create(); + + // Apriltag families to detect + + tf = tagStandard41h12_create(); // Recommended by the dev team. See https://github.com/AprilRobotics/apriltag/wiki/AprilTag-User-Guide#choosing-a-tag-family + apriltag_detector_add_family(td, tf); + + //tf = tag36h11_create(); // Supports more IDs + //apriltag_detector_add_family(td, tf); + + td->quad_decimate = 2; + td->nthreads = 4; + + tgt_offset = matd_create(3, 1); + tgt_offset->data[2] = 0; + + // Initialize apriltag detector with aruco camera matrix + Mat camera_matrix = CP.CameraMatrix; + det_info.fx = camera_matrix.at(0, 0); // Focal lengths + det_info.fy = camera_matrix.at(1, 1); + det_info.cx = camera_matrix.at(0, 2); // Principal points of a camera + det_info.cy = camera_matrix.at(1, 2); + + init_markers(); +} + +void drawARLandingCube(Mat &Image, aruco::Marker &m, const aruco::CameraParameters &CP); + +Mat matd_to_cvmat(matd_t* mat) +{ + int rows = mat->nrows; + int cols = mat->ncols; + Mat cvMat(rows, cols, CV_64FC1); + + for (int i = 0; i < rows; i++) + { + for (int j = 0; j < cols; j++) + { + double val = matd_get(mat, i, j); + cvMat.at(i, j) = val; + } + } + + return cvMat; +} + +bool apriltag_process_image(Mat img, aruco::CameraParameters& CP, LandingTarget& out) { + Mat gray_image; + cv::cvtColor(img, gray_image, cv::COLOR_BGR2GRAY); + bool found = false; + + unsigned char* buffer = gray_image.data; + int rows = gray_image.rows; + int cols = gray_image.cols; + + image_u8_t img_header = { .width = cols, .height = rows, .stride = cols, .buf = buffer }; + apriltag_pose_t pose; + +#if SPEED_TEST + struct timespec start, stop; + clock_gettime(CLOCK_MONOTONIC, &start); +#endif + + zarray_t *detections = apriltag_detector_detect(td, &img_header); + + // Sort the detections by size using zarray_sort (biggers first) + // We prefer to use the pose estimation of the biggest marker because it provides a better estimation + // NOTE: small markers can have very big z-component errors (over 2 [m]) because pixel error is amplified. Anyway, the projected landing point on the (x,y) plane won't change that much) + zarray_sort(detections, [](const void* p1, const void* p2) -> int { + apriltag_detection_t *det1 = *(apriltag_detection_t**) p1; + apriltag_detection_t *det2 = *(apriltag_detection_t**) p2; + + // Leave unknown markers (is_null) at the end + return markersById[det1->id].is_null() ? 1 : markersById[det1->id]["size"] < markersById[det2->id]["size"]; + }); + + matd_t* landingPoint = NULL; // First detected landing point (of the biggest marker). We use it as a reference for other markers. 
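// The loop below walks the detections biggest-first: the first detection with an entry in config.json
// provides the pose used for the landing point, and unless DEBUG or offset computation is enabled
// the loop stops after that marker.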
+ + if(test) cout << "---\n"; // New frame + + for (int i = 0; i < zarray_size(detections); i++) { + apriltag_detection_t *det; + zarray_get(detections, i, &det); + + if(test) cout << "Tag " << det->id << "\t"; + + json marker = markersById[det->id]; + if(marker.is_null()) { + if(test) cout << "Unknown\n"; + continue; + } + + found = true; + + det_info.det = det; + det_info.tagsize = marker["size"]; + tgt_offset->data[0] = marker["offsetX"]; + tgt_offset->data[1] = marker["offsetY"]; + + estimate_tag_pose(&det_info, &pose); + + // Compute projected landing point (LP) in 3D (unit is meters) + matd_t* op = matd_multiply(pose.R, tgt_offset); // offset point + matd_t* markerLP = matd_add(op, pose.t); + + out.id = det->id; + out.x = markerLP->data[0]; + out.y = markerLP->data[1]; + out.z = markerLP->data[2]; + const double pi = std::atan(1) * 4; + out.angle = (180 / pi) * std::atan2(pose.R->data[1], pose.R->data[0]); // Angle of the marker in degrees (counter clockwise) + + // TODO: Add: << markerLP->data[0] << ", " << markerLP->data[1] << ", " << markerLP->data[2] << "), " << pose.R->data[3] << ", " << pose.R->data[0] << "\n"; + // cout << "LP: (" << tgt_offset->data[0] << ", " << tgt_offset->data[1] << ")\n"; + if(test) { + if(!COMPUTE_OFFSETS || landingPoint) { + cout << "LP: (" << out.x << ", " << out.y << ", " << out.z << "), angle: " << out.angle << ", offset: (" << tgt_offset->data[0] << ", " << tgt_offset->data[1] << ")\n"; + } + } + + if(1) { + // Render marker corners + cv::line(img, cv::Point(det->p[0][0], det->p[0][1]), cv::Point(det->p[1][0], det->p[1][1]), Scalar(0, 0, 255), 2); + cv::line(img, cv::Point(det->p[1][0], det->p[1][1]), cv::Point(det->p[2][0], det->p[2][1]), Scalar(0, 0, 255), 2); + cv::line(img, cv::Point(det->p[2][0], det->p[2][1]), cv::Point(det->p[3][0], det->p[3][1]), Scalar(0, 0, 255), 2); + cv::line(img, cv::Point(det->p[3][0], det->p[3][1]), cv::Point(det->p[0][0], det->p[0][1]), Scalar(0, 0, 255), 2); + } else { + // Render marker cube + aruco::Marker m; + m.ssize = det_info.tagsize; + m.Rvec = matd_to_cvmat(pose.R); + m.Tvec = matd_to_cvmat(pose.t); + drawARLandingCube(img, m, CP); + } + + if(COMPUTE_OFFSETS) { + if(landingPoint) { + if(det->id == COMPUTE_OFFSETS) continue; + + if(1) { + // Show error to previous computed landing point + double ex = pow(markerLP->data[0] - landingPoint->data[0], 2); + double ey = pow(markerLP->data[1] - landingPoint->data[1], 2); + double ez = pow(markerLP->data[2] - landingPoint->data[2], 2); + double err = sqrt(ex + ey + ez); + + // Error of the projected landing point for this marker (relative to the projected landing point of the biggest marker) + cout << "\tERR: (" << ex << ", " << ey << ", " << ez << ") = " << err << "\n"; + } + + if(COMPUTE_OFFSETS) { + // Tool for computing offsets for rest of the markers based on first detected marker. + // USAGE: 1) Set the offset only on the biggest marker, 2) make sure the biggest marker is detected, 3) copy the "Computed offset" values in config.json + // NOTE: The z-component can have big errors and this can affect the computed offset. That's also why we prefer the biggest marker. + // NOTE: Returns garbage when marker sizes or z-component are incorrect. 
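// The offset is the reference landing point expressed in this marker's frame:
// offset = R^-1 * (landingPoint - t), where (R, t) is this marker's estimated pose.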
+ + matd_t* _invR = matd_inverse(pose.R); + matd_t* _diff = matd_subtract(landingPoint, pose.t); + matd_t* _offset = matd_multiply(_invR, _diff); + + cout << "\tComputed offsets:\n\t\t\"offsetX\": " << _offset->data[0] << ",\n\t\t\"offsetY\": " << _offset->data[1] << "\n"; + + matd_destroy(_invR); + matd_destroy(_diff); + matd_destroy(_offset); + } + + } else if(COMPUTE_OFFSETS && det->id == COMPUTE_OFFSETS) { + landingPoint = markerLP; + + // Repeat + i = -1; + continue; + } + } + + // --- Render landing point relative to marker --- + + Mat objectPoints(2, 3, CV_32FC1); + + // Marker center in world coordinates + objectPoints.at(0, 0) = pose.t->data[0]; + objectPoints.at(0, 1) = pose.t->data[1]; + objectPoints.at(0, 2) = pose.t->data[2]; + + // markerLP (landing point) in world coordinates + objectPoints.at(1, 0) = markerLP->data[0]; + objectPoints.at(1, 1) = markerLP->data[1]; + objectPoints.at(1, 2) = markerLP->data[2]; + + // Draw line from marker center to LP + std::vector imagePoints; + cv::projectPoints(objectPoints, zeroRotation, zeroTranslation, CP.CameraMatrix, CP.Distorsion, imagePoints); + cv::line(img, imagePoints[0], imagePoints[1], Scalar(255, 0, 0, 255), 1, cv::LINE_AA); + + // Draw cross on LP + Point2f dh = {10.0, 0}; + Point2f dv = {0, 10.0}; + cv::line(img, imagePoints[1] - dh, imagePoints[1] + dh, Scalar(0, 0, 255, 255), 1, cv::LINE_AA); + cv::line(img, imagePoints[1] - dv, imagePoints[1] + dv, Scalar(0, 0, 255, 255), 1, cv::LINE_AA); + + if(0) { + /** + Code that recevies the ipc_data: + See: https://github.com/chobitsfan/mavlink-udp-proxy/blob/new_main/my_mavlink_udp.c + + double tag_pose[6] = {0}; + if (recv(ipc_fd, tag_pose, sizeof(tag_pose), 0) > 0) { + gettimeofday(&tv, NULL); + float q[4] = {1, 0, 0, 0}; + int tag_id = tag_pose[0]; + double cam_r = tag_pose[1]; + double cam_d = tag_pose[2]; + double cam_f = tag_pose[3]; + mavlink_msg_landing_target_pack(mav_sysid, MY_COMP_ID, &msg, tv.tv_sec*1000000+tv.tv_usec, tag_id, MAV_FRAME_BODY_FRD, 0, 0, sqrt(cam_r*cam_r+cam_d*cam_d+cam_f*cam_f), 0, 0, -cam_d, cam_r, cam_f, q, 0, 1); + len = mavlink_msg_to_send_buffer(buf, &msg); + write(uart_fd, buf, len); + } + */ + double ipc_data[6]; + + ipc_data[0] = markerLP->data[0]; + ipc_data[1] = markerLP->data[1]; + ipc_data[2] = markerLP->data[2]; + + if (det->id == 0) { + ipc_data[3] = pose.R->data[3]; // I only need these 2 elements to compute yaw + ipc_data[4] = pose.R->data[0]; + } else { + ipc_data[3] = ipc_data[4] = 0; + } + + ipc_data[5] = det->id; + + // Send results + //sendto(ipc_fd, ipc_data, sizeof(ipc_data), 0, (const struct sockaddr *)&server, sizeof(server)); + } + + matd_destroy(op); + matd_destroy(markerLP); + matd_destroy(pose.t); + matd_destroy(pose.R); + + if(!DEBUG && !COMPUTE_OFFSETS) break; // Just use first detected marker + } + +#if SPEED_TEST + if (zarray_size(detections) == 0) { + clock_gettime(CLOCK_MONOTONIC, &stop); + printf("%d\n", (int)((stop.tv_sec-start.tv_sec)*1000+(stop.tv_nsec-start.tv_nsec)*1e-6)); + } +#endif + + apriltag_detections_destroy(detections); + + return found; +} diff --git a/src/apriltag.h b/src/apriltag.h new file mode 100644 index 0000000..58c7f4e --- /dev/null +++ b/src/apriltag.h @@ -0,0 +1,12 @@ +#include "aruco/aruco.h" + +struct LandingTarget { + int id; + double x; + double y; + double z; + double angle; +}; + +void apriltag_init(const aruco::CameraParameters &CP); +bool apriltag_process_image(cv::Mat img, aruco::CameraParameters& CamParam, LandingTarget& out); diff --git a/src/raw-tcp-video.cpp b/src/raw-tcp-video.cpp 
new file mode 100644 index 0000000..93dc3ad --- /dev/null +++ b/src/raw-tcp-video.cpp @@ -0,0 +1,157 @@ +#include +#include +#include +#include +#include +#include + +#include +#include +#include + +using namespace cv; +using namespace std; + +int server_socket, client_socket; +struct sockaddr_in server_address, client_address; + +double lastTime; + +void check_exit(); +double CLOCK(); + +void accept_connection() { + cout << "Waiting for video streamer...\n"; + + socklen_t client_address_size = sizeof(client_address); + + for(;;) { + client_socket = accept(server_socket, (struct sockaddr *)&client_address, &client_address_size); + if (client_socket < 0) { + if (errno == EWOULDBLOCK || errno == EAGAIN) { + cout << "Retrying accept\n"; + check_exit(); + continue; + } + std::cerr << "Accept failed\n"; + exit(1); + + } else { + break; // Connected + } + } + + cout << "Video streamer connected.\n"; +} + +void raw_tcp_reconnect() { + cout << "Client disconnected. Reconnecting...\n"; + close(client_socket); + accept_connection(); +} + +void init_video_socket() { + server_socket = socket(AF_INET, SOCK_STREAM, 0); + if (server_socket == -1) { + std::cerr << "Could not create socket\n"; + exit(1); + } + + // Prepare the sockaddr_in structure + server_address.sin_family = AF_INET; + server_address.sin_addr.s_addr = INADDR_ANY; + server_address.sin_port = htons(6000); + + // Bind the socket to the address and port + if (bind(server_socket, (struct sockaddr *)&server_address, sizeof(server_address)) < 0) { + std::cerr << "Bind failed\n"; + exit(1); + } + + // Listen for incoming connections + listen(server_socket, 3); + + accept_connection(); +} + +void close_video_socket() { + // Close sockets + close(client_socket); + close(server_socket); +} + +// TODO: OPT: Test working directly with yuvMat +Mat raw_tcp_get_image(uint64_t &ts, float &yaw) { + std::streamsize bytes; + int consecutiveErrors = 0; + for(;;) { + int w, h, size; + + bool err = false; + + if(!recv(client_socket, &w, sizeof(w), 0)) { + raw_tcp_reconnect(); + continue; + } + + if( + recv(client_socket, &h, sizeof(h), 0) <= 0 + || recv(client_socket, &ts, sizeof(ts), 0) <= 0 + || recv(client_socket, &yaw, sizeof(yaw), 0) <= 0 + || recv(client_socket, &size, sizeof(size), 0) <= 0 + ) { + err = true; + size = 0; + } + + char buffer[size]; + + int bytes_received = 0; + if(!err) { + // TODO: if socket is blocking, while is not necessary + while (bytes_received < size) { + int bytes = recv(client_socket, buffer + bytes_received, size - bytes_received, 0); + if (bytes == -1) { + err = true; + break; + /* + cout << "."; + continue; + */ + + } else if (!bytes) { + accept_connection(); + continue; + } + bytes_received += bytes; + } + } + + if(err) { + check_exit(); + if(!consecutiveErrors) cout << "Retrying...\n"; + consecutiveErrors++; + + if(consecutiveErrors > 100) sleep(1); // Client probably paused => relax CPU + + } else { + consecutiveErrors = 0; + + double now = CLOCK(); + + cout << "Got image"; + + if(lastTime) cout << ", ts: " << ts << ", ms: " << (now - lastTime); + lastTime = now; + + cout << ", bytes: " << bytes_received << "\n"; + cout.flush(); + + Mat yuvMat(h + h / 2, w, CV_8UC1, buffer); + Mat rgbMat; + cvtColor(yuvMat, rgbMat, COLOR_YUV2BGR_I420); + + return rgbMat; + } + } +} diff --git a/src/raw-tcp-video.h b/src/raw-tcp-video.h new file mode 100644 index 0000000..32b536b --- /dev/null +++ b/src/raw-tcp-video.h @@ -0,0 +1,10 @@ +// --- Raw Video Stream --- + +// This is an alternative image grabber that raw images send via tcp +// 
To enable, use input arg = "raw-tcp" + +void accept_connection(); +void raw_tcp_reconnect(); +void init_video_socket(); +void close_video_socket(); +Mat raw_tcp_get_image(uint64_t &ts, float &yaw); diff --git a/src/track_targets.cpp b/src/track_targets.cpp index 5badc0f..7a64029 100644 --- a/src/track_targets.cpp +++ b/src/track_targets.cpp @@ -26,6 +26,8 @@ Run separately with: ./track_targets -d TAG36h11 /dev/video0 calibration.yml 0.2 #include #include #include +#include +#include #include "args.hxx" #include "pkQueueTS.hpp" @@ -37,26 +39,61 @@ Run separately with: ./track_targets -d TAG36h11 /dev/video0 calibration.yml 0.2 #include "aruco/aruco.h" #include "aruco/timers.h" +#include "apriltag.h" +#include "raw-tcp-video.h" + using namespace cv; using namespace aruco; +using namespace std; + +using nlohmann::json; + +json CONFIG; + +// TODO: Migrate vision_landing.conf to config.json +void load_config() { + std::ifstream ifs("config.json"); + if (!ifs.is_open()) { + cout << "Couldn't open 'config.json'" << endl; + exit(1); + } + ifs >> CONFIG; +} -// Setup sig handling static volatile sig_atomic_t sigflag = 0; static volatile sig_atomic_t stateflag = 0; // 0 = stopped, 1 = started + +const bool useApriltag = true; +bool test = false; +const bool isCamera = false; // Disable when input is not a camera. Otherwise the program will hang. +bool useVideoSocket = false; +#include + +int inputwidth = 640; +int inputheight = 480; +int frameSize; + +std::thread inThread; + +// Setup sig handling void handle_sig(int sig) { - std::cout << "info:SIGNAL:" << sig << ":Received" << std::endl; + cout << "info:SIGNAL:" << sig << ":Received" << std::endl; sigflag = 1; + pthread_kill(inThread.native_handle(), SIGUSR1); // Interrupt recv() and accept() } void handle_sigusr1(int sig) { - std::cout << "info:SIGNAL:SIGUSR1:Received:" << sig << ":Switching on Vision Processing" << std::endl; + cout << "info:SIGNAL:SIGUSR1:Received:" << sig << ":Switching on Vision Processing" << std::endl; stateflag = 1; } void handle_sigusr2(int sig) { - std::cout << "info:SIGNAL:SIGUSR2:Received:" << sig << ":Switching off Vision Processing" << std::endl; + cout << "info:SIGNAL:SIGUSR2:Received:" << sig << ":Switching off Vision Processing" << std::endl; stateflag = 0; + + // TODO: Necessary? 
+ pthread_kill(inThread.native_handle(), SIGUSR1); // Interrupt recv() and accept() } // Setup fps tracker @@ -66,6 +103,14 @@ double CLOCK() clock_gettime(CLOCK_MONOTONIC, &t); return (t.tv_sec * 1000) + (t.tv_nsec * 1e-6); } + +void check_exit() { + if(sigflag == 1) exit(0); +} + +extern int server_socket, client_socket; +extern struct sockaddr_in server_address, client_address; + double _avgdur = 0; double _fpsstart = 0; double _avgfps = 0; @@ -96,12 +141,15 @@ void drawARLandingCube(cv::Mat &Image, Marker &m, const CameraParameters &CP) objectPoints.at(0, 0) = -halfSize; objectPoints.at(0, 1) = -halfSize; objectPoints.at(0, 2) = 0; + objectPoints.at(1, 0) = halfSize; objectPoints.at(1, 1) = -halfSize; objectPoints.at(1, 2) = 0; + objectPoints.at(2, 0) = halfSize; objectPoints.at(2, 1) = halfSize; objectPoints.at(2, 2) = 0; + objectPoints.at(3, 0) = -halfSize; objectPoints.at(3, 1) = halfSize; objectPoints.at(3, 2) = 0; @@ -109,12 +157,15 @@ void drawARLandingCube(cv::Mat &Image, Marker &m, const CameraParameters &CP) objectPoints.at(4, 0) = -halfSize; objectPoints.at(4, 1) = -halfSize; objectPoints.at(4, 2) = m.ssize; + objectPoints.at(5, 0) = halfSize; objectPoints.at(5, 1) = -halfSize; objectPoints.at(5, 2) = m.ssize; + objectPoints.at(6, 0) = halfSize; objectPoints.at(6, 1) = halfSize; objectPoints.at(6, 2) = m.ssize; + objectPoints.at(7, 0) = -halfSize; objectPoints.at(7, 1) = halfSize; objectPoints.at(7, 2) = m.ssize; @@ -178,6 +229,7 @@ struct TimeCapsuleVars { uint64_t timeStamp = 0; uint64_t frameNum = 0; + float yaw = 0; }; typedef MatCapsule_ MatTimeCapsule; pkQueueTS incomingQueue; @@ -189,25 +241,37 @@ cv::VideoCapture vreader; // Thread to control capture device and push frames onto the threadsafe queue void incomingThread() { + signal(SIGINT, handle_sig); // Required for interupting raw_tcp_get_image > accept() + signal(SIGUSR1, handle_sigusr1); // Required for interupting raw_tcp_get_image > recv() + // Enable capture input grabState = true; // Incoming frame MatTimeCapsule iframe; uint64_t frameNum = 0; - std::cout << "Starting incomingThread()" << std::endl; + cout << "Starting incomingThread()" << std::endl; while (grabState) { // Read the next frame in from the camera and push it to back of queue - // std::cout << "debug:Pushing camera frame to queue" << std::endl; - vreader.grab(); + // cout << "debug:Pushing camera frame to queue" << std::endl; + if(!useVideoSocket) { + vreader.grab(); + } // Lodge clock for start of frame, after grabbing the frame but before decompressing/converting it. // This is as close as we can get to the shutter time, for a better match to inertial frame. 
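// For the raw-tcp input the timestamp and yaw arrive embedded in the stream and are taken from the
// frame itself below; for a normal capture device the timestamp is taken locally and yaw is flagged
// as unavailable (1000).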
frameNum++; iframe.vars.frameNum = frameNum; - iframe.vars.timeStamp = CLOCK() * 1e+6; + // Now process the grabbed frame - vreader.retrieve(iframe.mat); + + if(useVideoSocket) { + iframe.mat = raw_tcp_get_image(iframe.vars.timeStamp, iframe.vars.yaw); + } else { + iframe.vars.yaw = 1000; // Means: no yaw in image => use MAVLink provided yaw + vreader.retrieve(iframe.mat); + iframe.vars.timeStamp = CLOCK() * 1e+6; + } // Push the timeframe capsule onto the queue, ready for collection and processing incomingQueue.Push(iframe); @@ -230,7 +294,7 @@ void outgoingThread() pushState = true; // Outgoing frame MatCapsule oframe; - std::cout << "Starting outgoingThread()" << std::endl; + cout << "Starting outgoingThread()" << std::endl; while (pushState) { @@ -238,7 +302,7 @@ void outgoingThread() // Read frame in from the queue and push it to stream if (qsize > 0) { - std::cout << "debug:Pushing output frame to stream:" << qsize << std::endl; + //cout << "debug:Pushing output frame to stream:" << qsize << std::endl; // If the queue already has more than 1 frame then pop it, to prevent buildup // We only want to send the latest frame while (qsize > 1) @@ -257,16 +321,18 @@ void outgoingThread() int main(int argc, char **argv) { // Unbuffer stdout and stdin - std::cout.setf(std::ios::unitbuf); + cout.setf(std::ios::unitbuf); std::ios_base::sync_with_stdio(false); - // Make sure std::cout does not show scientific notation - std::cout.setf(ios::fixed); + // Make sure cout does not show scientific notation + cout.setf(ios::fixed); // Setup arguments for parser args::ArgumentParser parser("Track fiducial markers and estimate pose, output translation vectors for vision_landing"); args::HelpFlag help(parser, "help", "Display this help menu", {'h', "help"}); args::Flag verbose(parser, "verbose", "Verbose", {'v', "verbose"}); + args::Flag testFlag(parser, "test", "Start detecting markers immediately", {'t', "test"}); + args::ValueFlag getOffsets(parser, "markerid", "Compute marker offsets to the smalles marker center", {"get-offsets"}); args::ValueFlag markerid(parser, "markerid", "Marker ID", {'i', "id"}); args::ValueFlag dict(parser, "dict", "Marker Dictionary", {'d', "dict"}); args::ValueFlag output(parser, "output", "Output Stream", {'o', "output"}); @@ -289,7 +355,7 @@ int main(int argc, char **argv) } catch (args::Help) { - std::cout << parser; + cout << parser; return 0; } catch (args::ParseError e) @@ -305,53 +371,62 @@ int main(int argc, char **argv) return 1; } - if (!input || !calibration || !markersize) + if (!input || !calibration) { - std::cout << parser; + cout << parser; return 1; } - // Open camera - vreader.open( args::get(input) ); - - // Bail if camera can't be opened - if (!vreader.isOpened()) - { - std::cerr << "Error: Input stream can't be opened" << std::endl; - return 1; + if(input.Get().compare("raw-tcp") == 0) { + useVideoSocket = true; } - // Register signals - signal(SIGINT, handle_sig); - signal(SIGTERM, handle_sig); - signal(SIGUSR1, handle_sigusr1); - signal(SIGUSR2, handle_sigusr2); + if(useVideoSocket) { + init_video_socket(); + + } else { + // Open camera + vreader.open( args::get(input) ); + + // Bail if camera can't be opened + if (!vreader.isOpened()) + { + std::cerr << "Error: Input stream can't be opened" << std::endl; + return 1; + } + } // If resolution is specified then use, otherwise use default - int inputwidth = 640; - int inputheight = 480; if (width) inputwidth = args::get(width); if (height) inputheight = args::get(height); + frameSize = inputwidth * 
inputheight * 3; + // If fps is specified then use, otherwise use default int inputfps = 30; if (fps) inputfps = args::get(fps); - // If brightness is specified then use, otherwise use default - if (brightness) - vreader.set(CAP_PROP_BRIGHTNESS, args::get(brightness)); + if(!useVideoSocket) { + // If brightness is specified then use, otherwise use default + if (brightness) + vreader.set(CAP_PROP_BRIGHTNESS, args::get(brightness)); + + if(isCamera) { + // Set camera properties + vreader.set(cv::CAP_PROP_FRAME_WIDTH, inputwidth); + vreader.set(cv::CAP_PROP_FRAME_HEIGHT, inputheight); - // Set camera properties - vreader.set(cv::CAP_PROP_FRAME_WIDTH, inputwidth); - vreader.set(cv::CAP_PROP_FRAME_HEIGHT, inputheight); - vreader.set(cv::CAP_PROP_FPS, inputfps); - // vreader.set(CAP_PROP_BUFFERSIZE, 0); // Doesn't work yet with V4L2 + vreader.set(cv::CAP_PROP_FPS, inputfps); + // vreader.set(CAP_PROP_BUFFERSIZE, 0); // Doesn't work yet with V4L2 + } + } // Read and parse camera calibration data aruco::CameraParameters CamParam; + CamParam.readFromXMLFile(args::get(calibration)); if (!CamParam.isValid()) { @@ -362,18 +437,43 @@ int main(int argc, char **argv) // Take a single image and resize calibration parameters based on input stream dimensions // Note we read directly from vreader here because we haven't turned the reader thread on yet Mat rawimage; - vreader >> rawimage; + + if(useVideoSocket) { + uint64_t ts; + float yaw; + rawimage = raw_tcp_get_image(ts, yaw); + } else { + vreader >> rawimage; + } + + // TODO: Makes no sense, because if the resolution changes, the CamParam parameters also change. CamParam.resize(rawimage.size()); + cout << "\n"; // Separate storm of OpenCV / GStreamer init warnings + + load_config(); + + if(useApriltag) { + // Use existing aruco camera calibration parameters for the apriltag detector + apriltag_init(CamParam); + + if(getOffsets) { + extern int COMPUTE_OFFSETS; + COMPUTE_OFFSETS = args::get(getOffsets); + cout << "Computing offsets of marker #" << COMPUTE_OFFSETS << "...\n"; + + test = true; + } + } + // Calculate the fov from the calibration intrinsics const double pi = std::atan(1) * 4; - const double fovx = 2 * atan(inputwidth / (2 * CamParam.CameraMatrix.at(0, 0))) * (180.0 / pi); - const double fovy = 2 * atan(inputheight / (2 * CamParam.CameraMatrix.at(1, 1))) * (180.0 / pi); - std::cout << "info:FoVx~" << fovx << ":FoVy~" << fovy << ":vWidth~" << inputwidth << ":vHeight~" << inputheight << std::endl; - - // Turn on incoming thread - std::thread inThread(incomingThread); - MatTimeCapsule iframe; + const double fx = CamParam.CameraMatrix.at(0, 0); + const double fy = CamParam.CameraMatrix.at(1, 1); + const double fovx = 2 * atan(inputwidth / (2 * fx)) * (180.0 / pi); + const double fovy = 2 * atan(inputheight / (2 * fy)) * (180.0 / pi); + // cout << "info:FoVx~" << fovx << ":FoVy~" << fovy << ":vWidth~" << inputwidth << ":vHeight~" << inputheight << std::endl; + cout << "info: fx: " << fx << ", fy: " << fy << ", fovx: " << fovx << ", fovy: " << fovy << ", w: " << inputwidth << ", h: " << inputheight << std::endl; // If output specified then setup the pipeline unless output is set to 'window' if (output && args::get(output) != "window") @@ -393,6 +493,32 @@ int main(int argc, char **argv) return 1; } } + + /* + if(test) { + for(;;) { + cout << "Reading image...\n"; + Mat frame = raw_tcp_get_image(); + cout << "Writing image...\n"; + if(1) { + cv::imwrite("/var/www/html/out.png", frame); + } else { + cout << "Writing image...\n"; + for(int i = 0; i < 7; 
i++) { + vwriter.write(frame); + } + } + cout << "Ready\n"; + } + } + */ + + // Turn on incoming thread + + inThread = std::thread(incomingThread); + + MatTimeCapsule iframe; + // Turn on writer thread std::thread outThread(outgoingThread); MatCapsule oframe; @@ -435,12 +561,12 @@ int main(int argc, char **argv) markerSizes[atoi(_s.substr(0, _i).data())] = atof(_s.substr(_i + 1).data()); } // Debug print the constructed map - std::cout << "info:Size Mappings:"; + cout << "info:Size Mappings:"; for (const auto &p : markerSizes) { - std::cout << p.first << "=" << p.second << ", "; + cout << p.first << "=" << p.second << ", "; } - std::cout << std::endl; + cout << std::endl; // Setup marker thresholding uint32_t active_marker; @@ -457,13 +583,28 @@ int main(int argc, char **argv) { marker_history = 15; } - std::cout << "debug:Marker History:" << marker_history << std::endl; + cout << "debug:Marker History:" << marker_history << std::endl; uint32_t marker_threshold; marker_threshold = args::get(markerthreshold); - std::cout << "debug:Marker Threshold:" << marker_threshold << std::endl; + cout << "debug:Marker Threshold:" << marker_threshold << std::endl; // Print a specific info message to signify end of initialisation - std::cout << "info:initcomp:Initialisation Complete" << std::endl; + cout << "info:initcomp:Initialisation Complete" << std::endl; + + cout << "\n"; + + // Register signals just right before the main loop. Otherwise SIGINT won't work if video init hangs. + + signal(SIGINT, handle_sig); + if(testFlag || test) { + stateflag = true; + test = true; + + } else { + signal(SIGTERM, handle_sig); + signal(SIGUSR1, handle_sigusr1); + signal(SIGUSR2, handle_sigusr2); + } // Main loop while (true) @@ -471,7 +612,7 @@ int main(int argc, char **argv) // If signal for interrupt/termination was received, break out of main loop and exit if (sigflag) { - std::cout << "info:Signal Detected:Exiting track_targets" << std::endl; + cout << "info:Signal Detected:Exiting track_targets" << std::endl; break; } @@ -488,14 +629,14 @@ int main(int argc, char **argv) ScopedTimerEvents Timer("detectloop"); // Read a camera frame from the incoming thread, with 1 second timeout - // std::cout << "debug:Incoming queue size:" << incomingQueue.Size() << std::endl; + // cout << "debug:Incoming queue size:" << incomingQueue.Size() << std::endl; pkQueueResults res = incomingQueue.Pop(iframe, 1000); if (res != PK_QTS_OK) { if (verbose && res == PK_QTS_TIMEOUT) - std::cout << "debug:Time out reading from the camera thread" << std::endl; + cout << "debug:Time out reading from the camera thread" << std::endl; if (verbose && res == PK_QTS_EMPTY) - std::cout << "debug:Camera thread is empty" << std::endl; + cout << "debug:Camera thread is empty" << std::endl; this_thread::yield(); continue; } @@ -505,176 +646,200 @@ int main(int argc, char **argv) if (iframe.mat.empty()) continue; - // Detect markers - std::vector Markers = MDetector.detect(iframe.mat); - Timer.add("MarkerDetect"); + if(useApriltag) { + LandingTarget out; + if(apriltag_process_image(iframe.mat, CamParam, out)) { + double processtime = CLOCK() - framestart; + if(!test) cout << "target:" << out.id << ":" << out.x << ":" << out.y << ":" << out.z << ":" << processtime << ":" << iframe.vars.timeStamp << ":" << CLOCK() * 1e+6 << ":" << out.angle << ":" << iframe.vars.yaw << std::endl << std::flush; + } - // Order the markers in ascending size - we want to start with the smallest. 
- std::map markerAreas; - std::map markerIds; - for (auto &marker : Markers) - { - markerAreas[marker.getArea()] = marker.id; - markerIds[marker.id] = true; - // If the marker doesn't already exist in the threshold tracking, add and populate with full set of zeros - if (marker_history_queue.count(marker.id) == 0) + } else { + // Detect markers + std::vector Markers = MDetector.detect(iframe.mat); + Timer.add("MarkerDetect"); + + if(Markers.size()) { + cout << "Markers: " << Markers.size() << "\n"; + } + + // Order the markers in ascending size - we want to start with the smallest. + std::map markerAreas; + std::map markerIds; + for (auto &marker : Markers) { - for (unsigned int i = 0; i < marker_history; i++) - marker_history_queue[marker.id].push(0); + markerAreas[marker.getArea()] = marker.id; + markerIds[marker.id] = true; + // If the marker doesn't already exist in the threshold tracking, add and populate with full set of zeros + if (marker_history_queue.count(marker.id) == 0) + { + for (unsigned int i = 0; i < marker_history; i++) + marker_history_queue[marker.id].push(0); + } } - } - // Iterate through marker history and update for this frame - for (auto &markerhist : marker_history_queue) - { - // If marker was detected in this frame, push a 1 - (markerIds.count(markerhist.first)) ? markerhist.second.push(1) : markerhist.second.push(0); - // If the marker history has reached history limit, pop the oldest element - if (markerhist.second.size() > marker_history) + // Iterate through marker history and update for this frame + for (auto &markerhist : marker_history_queue) { - markerhist.second.pop(); + // If marker was detected in this frame, push a 1 + (markerIds.count(markerhist.first)) ? markerhist.second.push(1) : markerhist.second.push(0); + // If the marker history has reached history limit, pop the oldest element + if (markerhist.second.size() > marker_history) + { + markerhist.second.pop(); + } } - } - // If marker is set in config, use that to lock on - if (markerid) - { - active_marker = args::get(markerid); - // Otherwise find the smallest marker that has a size mapping - } - else - { - for (auto &markerArea : markerAreas) + // If marker is set in config, use that to lock on + if (markerid) { - uint32_t thisId = markerArea.second; - if (markerSizes[thisId]) + active_marker = args::get(markerid); + // Otherwise find the smallest marker that has a size mapping + } + else + { + for (auto &markerArea : markerAreas) { - // If the current history for this marker is >threshold, then set as the active marker and clear marker histories. Otherwise, skip to the next sized marker. - uint32_t _histsum = markerHistory(marker_history_queue, thisId, marker_history); - float _histthresh = marker_history * ((float)marker_threshold / (float)100); - if (_histsum > _histthresh) + uint32_t thisId = markerArea.second; + if (markerSizes[thisId]) { - if (active_marker == thisId) - break; // Don't change to the same thing - changeActiveMarker(marker_history_queue, active_marker, thisId, marker_history); - if (verbose) + // If the current history for this marker is >threshold, then set as the active marker and clear marker histories. Otherwise, skip to the next sized marker. 
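// Example: with the default 15-frame history and a 50% threshold, _histthresh = 7.5, so the marker
// must have been detected in at least 8 of the last 15 frames to become the active marker.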
+ uint32_t _histsum = markerHistory(marker_history_queue, thisId, marker_history); + float _histthresh = marker_history * ((float)marker_threshold / (float)100); + if (_histsum > _histthresh) { - std::cout << "debug:changing active_marker:" << thisId << ":" << _histsum << ":" << _histthresh << ":" << std::endl; - std::cout << "debug:marker history:"; - for (auto &markerhist : marker_history_queue) + if (active_marker == thisId) + break; // Don't change to the same thing + changeActiveMarker(marker_history_queue, active_marker, thisId, marker_history); + if (verbose) { - std::cout << markerhist.first << ":" << markerHistory(marker_history_queue, markerhist.first, marker_history) << ":"; + cout << "debug:changing active_marker:" << thisId << ":" << _histsum << ":" << _histthresh << ":" << std::endl; + cout << "debug:marker history:"; + for (auto &markerhist : marker_history_queue) + { + cout << markerhist.first << ":" << markerHistory(marker_history_queue, markerhist.first, marker_history) << ":"; + } + cout << std::endl; } - std::cout << std::endl; + break; } - break; } } } - } - // If a marker lock hasn't been found by this point, use the smallest found marker with the default marker size - if (!active_marker) - { - for (auto &markerArea : markerAreas) + // If a marker lock hasn't been found by this point, use the smallest found marker with the default marker size + if (!active_marker) { - uint32_t thisId = markerArea.second; - // If the history threshold for this marker is >50%, then set as the active marker and clear marker histories. Otherwise, skip to the next sized marker. - uint32_t _histsum = markerHistory(marker_history_queue, thisId, marker_history); - float _histthresh = marker_history * ((float)marker_threshold / (float)100); - if (_histsum > _histthresh) + for (auto &markerArea : markerAreas) { - if (verbose) - std::cout << "debug:changing active_marker:" << thisId << std::endl; - changeActiveMarker(marker_history_queue, active_marker, thisId, marker_history); - break; + uint32_t thisId = markerArea.second; + // If the history threshold for this marker is >50%, then set as the active marker and clear marker histories. Otherwise, skip to the next sized marker. + uint32_t _histsum = markerHistory(marker_history_queue, thisId, marker_history); + float _histthresh = marker_history * ((float)marker_threshold / (float)100); + if (_histsum > _histthresh) + { + if (verbose) + cout << "debug:changing active_marker:" << thisId << std::endl; + changeActiveMarker(marker_history_queue, active_marker, thisId, marker_history); + break; + } } } - } - Timer.add("MarkerLock"); + Timer.add("MarkerLock"); - // Iterate through the markers, in order of size, and do pose estimation - for (auto &markerArea : markerAreas) - { - if (markerArea.second != active_marker) - continue; // Don't do pose estimation if not active marker, save cpu cycles - float _size; - // If marker size mapping exists for this marker, use it for pose estimation - if (markerSizes[markerArea.second]) - { - _size = markerSizes[markerArea.second]; - // Otherwise use generic marker size - } - else if (MarkerSize) - { - _size = MarkerSize; - if (verbose) - std::cout << "debug:defaulting to generic marker size: " << markerArea.second << std::endl; - } - // Find the Marker in the Markers map and do pose estimation. I'm sure there's a better way of iterating through the map.. 
- for (unsigned int i = 0; i < Markers.size(); i++) + // Iterate through the markers, in order of size, and do pose estimation + for (auto &markerArea : markerAreas) { - if (Markers[i].id == markerArea.second) + if (markerArea.second != active_marker) + continue; // Don't do pose estimation if not active marker, save cpu cycles + float _size; + // If marker size mapping exists for this marker, use it for pose estimation + if (markerSizes[markerArea.second]) { - MTracker[markerArea.second].estimatePose(Markers[i], CamParam, _size); + _size = markerSizes[markerArea.second]; + // Otherwise use generic marker size + } + else if (MarkerSize) + { + _size = MarkerSize; + if (verbose) + cout << "debug:defaulting to generic marker size: " << markerArea.second << std::endl; + } + // Find the Marker in the Markers map and do pose estimation. I'm sure there's a better way of iterating through the map.. + for (unsigned int i = 0; i < Markers.size(); i++) + { + if (Markers[i].id == markerArea.second) + { + MTracker[markerArea.second].estimatePose(Markers[i], CamParam, _size); + } } } - } - Timer.add("PoseEstimation"); + Timer.add("PoseEstimation"); - // Iterate through each detected marker and send data for active marker and draw green AR cube, otherwise draw red AR square - for (unsigned int i = 0; i < Markers.size(); i++) - { - double processtime = CLOCK() - framestart; - // If marker id matches current active marker, draw a green AR cube - if (Markers[i].id == active_marker) + // Iterate through each detected marker and send data for active marker and draw green AR cube, otherwise draw red AR square + for (unsigned int i = 0; i < Markers.size(); i++) { - if (output) + double processtime = CLOCK() - framestart; + // If marker id matches current active marker, draw a green AR cube + if (Markers[i].id == active_marker) { - Markers[i].draw(iframe.mat, Scalar(0, 255, 0), 2, false); + if (output) + { + Markers[i].draw(iframe.mat, Scalar(0, 255, 0), 2, false); + } + // If pose estimation was successful, calculate data and output to anyone listening. 
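// A positive z component of Tvec means the marker was resolved in front of the camera;
// a non-positive value is treated as a failed pose estimation.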
+ if (Markers[i].Tvec.at(0, 2) > 0) + { + // Calculate vector norm for distance + double distance = sqrt(pow(Markers[i].Tvec.at(0, 0), 2) + pow(Markers[i].Tvec.at(0, 1), 2) + pow(Markers[i].Tvec.at(0, 2), 2)); + // Calculate angular offsets in radians of center of detected marker + double xoffset = (Markers[i].getCenter().x - inputwidth / 2.0) * (fovx * (pi / 180)) / inputwidth; + double yoffset = (Markers[i].getCenter().y - inputheight / 2.0) * (fovy * (pi / 180)) / inputheight; + if (verbose) + { + Ti = MParams.minSize; + ThresHold = MParams.ThresHold; + cout << "debug:active_marker:" << active_marker << ":center~" << Markers[i].getCenter() << ":area~" << Markers[i].getArea() << ":marker~" << Markers[i] << ":vectornorm~" << distance << ":Ti~" << Ti << ":ThresHold~" << ThresHold << ":Markers~" << Markers.size() << ":processtime~" << processtime << ":timestamp~" << iframe.vars.timeStamp << std::endl; + } + // This is the main message that track_targets outputs, containing the active target data + + // IX: get the rotation matrix of the marker + cv::Mat marker_rot; + cv::Rodrigues(Markers[i].Rvec, marker_rot); + + // IX: extract the rotation around the z-axis + double angle = atan2(marker_rot.at(1, 0), marker_rot.at(0, 0)); + + cout << "target:" << Markers[i].id << ":" << xoffset << ":" << yoffset << ":" << distance << ":" << processtime << ":" << iframe.vars.timeStamp << ":" << CLOCK() * 1e+6 << ":" << angle << ":1000" << std::endl << std::flush; + + // cout << "target:" << Markers[i].id << ":" << xoffset << ":" << yoffset << ":" << distance << ":" << processtime << ":" << CLOCK() * 1e+6 << std::endl << std::flush; + std::fflush(stdout); // explicitly flush stdout buffer + Timer.add("SendMessage"); + // Draw AR cube and distance + if (output) + { // don't burn cpu cycles if no output + drawARLandingCube(iframe.mat, Markers[i], CamParam); + CvDrawingUtils::draw3dAxis(iframe.mat, Markers[i], CamParam); + drawVectors(iframe.mat, Scalar(0, 255, 0), 1, (i + 1) * 20, Markers[i].id, xoffset, yoffset, distance, Markers[i].getCenter().x, Markers[i].getCenter().y); + Timer.add("DrawGreenAR"); + } + } else { + cout << "IX: pose estimation failed: " << Markers[i].Tvec.at(0, 2) << "\n"; + } + // Otherwise draw a red square } - // If pose estimation was successful, calculate data and output to anyone listening. 
- if (Markers[i].Tvec.at(0, 2) > 0) + else { - // Calculate vector norm for distance - double distance = sqrt(pow(Markers[i].Tvec.at(0, 0), 2) + pow(Markers[i].Tvec.at(0, 1), 2) + pow(Markers[i].Tvec.at(0, 2), 2)); - // Calculate angular offsets in radians of center of detected marker - double xoffset = (Markers[i].getCenter().x - inputwidth / 2.0) * (fovx * (pi / 180)) / inputwidth; - double yoffset = (Markers[i].getCenter().y - inputheight / 2.0) * (fovy * (pi / 180)) / inputheight; if (verbose) - { - Ti = MParams.minSize; - ThresHold = MParams.ThresHold; - std::cout << "debug:active_marker:" << active_marker << ":center~" << Markers[i].getCenter() << ":area~" << Markers[i].getArea() << ":marker~" << Markers[i] << ":vectornorm~" << distance << ":Ti~" << Ti << ":ThresHold~" << ThresHold << ":Markers~" << Markers.size() << ":processtime~" << processtime << ":timestamp~" << iframe.vars.timeStamp << std::endl; - } - // This is the main message that track_targets outputs, containing the active target data - std::cout << "target:" << Markers[i].id << ":" << xoffset << ":" << yoffset << ":" << distance << ":" << processtime << ":" << iframe.vars.timeStamp << ":" << CLOCK() * 1e+6 << std::endl << std::flush; - // std::cout << "target:" << Markers[i].id << ":" << xoffset << ":" << yoffset << ":" << distance << ":" << processtime << ":" << CLOCK() * 1e+6 << std::endl << std::flush; - std::fflush(stdout); // explicitly flush stdout buffer - Timer.add("SendMessage"); - // Draw AR cube and distance + cout << "debug:inactive_marker:center~" << Markers[i].getCenter() << ":area~" << Markers[i].getArea() << ":marker~" << Markers[i] << ":Ti~" << Ti << ":ThresHold~" << ThresHold << ":Markers~" << Markers.size() << ":processtime~" << processtime << std::endl; if (output) { // don't burn cpu cycles if no output - drawARLandingCube(iframe.mat, Markers[i], CamParam); - CvDrawingUtils::draw3dAxis(iframe.mat, Markers[i], CamParam); - drawVectors(iframe.mat, Scalar(0, 255, 0), 1, (i + 1) * 20, Markers[i].id, xoffset, yoffset, distance, Markers[i].getCenter().x, Markers[i].getCenter().y); - Timer.add("DrawGreenAR"); + Markers[i].draw(iframe.mat, Scalar(0, 0, 255), 2, false); + drawVectors(iframe.mat, Scalar(0, 0, 255), 1, (i + 1) * 20, Markers[i].id, 0, 0, Markers[i].Tvec.at(0, 2), Markers[i].getCenter().x, Markers[i].getCenter().y); + Timer.add("DrawRedAR"); } } - // Otherwise draw a red square - } - else - { - if (verbose) - std::cout << "debug:inactive_marker:center~" << Markers[i].getCenter() << ":area~" << Markers[i].getArea() << ":marker~" << Markers[i] << ":Ti~" << Ti << ":ThresHold~" << ThresHold << ":Markers~" << Markers.size() << ":processtime~" << processtime << std::endl; - if (output) - { // don't burn cpu cycles if no output - Markers[i].draw(iframe.mat, Scalar(0, 0, 255), 2, false); - drawVectors(iframe.mat, Scalar(0, 0, 255), 1, (i + 1) * 20, Markers[i].id, 0, 0, Markers[i].Tvec.at(0, 2), Markers[i].getCenter().x, Markers[i].getCenter().y); - Timer.add("DrawRedAR"); - } } + Timer.add("UseMarkers"); } - Timer.add("UseMarkers"); if (output && args::get(output) != "window") { @@ -698,21 +863,28 @@ int main(int argc, char **argv) sprintf(framestr, "info:avgframedur~%fms:fps~%f:frameno~%d:", avgdur(framedur), avgfps(), frameno++); if (verbose && (frameno % 100 == 1)) { - std::cout << framestr << std::endl; + cout << framestr << std::endl; } Timer.add("EndofLoop"); } // Stop incoming thread and flush queue - std::cout << "info:track_targets complete, exiting" << std::endl; + cout << "info:track_targets 
complete, exiting" << std::endl; grabState = false; pushState = false; while (PK_QTS_EMPTY != incomingQueue.Pop(iframe, 0)); inThread.join(); while (PK_QTS_EMPTY != outgoingQueue.Pop(oframe, 0)); outThread.join(); - vreader.release(); + + close_video_socket(); + + if(!useVideoSocket) { + vreader.release(); + } vwriter.release(); - std::cout.flush(); + + cout.flush(); + return 0; } diff --git a/vision_landing b/vision_landing index b5d9901..8ee7332 100755 --- a/vision_landing +++ b/vision_landing @@ -1,12 +1,22 @@ #!/usr/bin/env python2 # -*- coding: utf-8 -*- -# vision_landing -# https://github.com/fnoop/vision_landing -# -# This python script connects to arducopter using dronekit, to control precision landing. -# The control is purely visual using pose estimation from fiducial markers (eg. aruco or april tags), no additional rangefinder is needed. -# It launches a separate program track_targets which does the actual cv work and captures the resulting vectors from non-blocking thread handler. +""" +vision_landing +https://github.com/kripper/vision-landing-2 + +This python script connects to arducopter using dronekit, to control precision landing. +The control is purely visual using pose estimation from fiducial markers (eg. aruco or april tags), no additional rangefinder is needed. +It launches a separate program track_targets which does the actual cv work and captures the resulting vectors from non-blocking thread handler. + +TODO: +- Controller: + - Detect velocity using last measurements after eg. 2 [s] (to remove the transient) + - Measure total time for different durations (eg. 1 [s] and 2 [s]) to calculate the transient at start and end of motion. + - We asume the transient are the same of different durations + - This way, we can model the total time and velocity-output requried for a given distance and time +- Stop descending if marker is getting our of sight (drone should center before continue descending) +""" from threading import Thread, Event from Queue import LifoQueue, Queue, Empty @@ -24,12 +34,59 @@ from math import sqrt import signal import logging import ctypes +import math +import atexit + +import numpy as np +import sys + +useSmartLanding = True + +if useSmartLanding: + import os + dir_path = os.path.dirname(os.path.realpath(__file__)) + sys.path.append(dir_path + '/../SmartLanding/') + + #from smart_landing.drone_controller import DroneController, Target + from smart_landing.drone_controller import DroneController, Target, Simulator + + # TODO: Tune + cameraDelay = 5 + actionDelay = 2 + drone = DroneController(False, cameraDelay, actionDelay) + + if False: + sim = Simulator(cameraDelay, False) + sim.interactive = True + sim.renderAbsolute = True + drone.simulator = sim + else: + sim = None ### ================================ ### Define Functions ### ================================ +def printNoLF(str): + # TODO: Python 3: print(str, end="") + sys.stdout.write(str) + sys.stdout.flush() + +# Convert forward,right -> north,east +# r is clockwise in radians with 0 pointing north +# Use negative r to convert north,east -> forward,right +def rotate(forward,right, r): + c = math.cos(r) + s = math.sin(r) + return forward * c - right * s, forward * s + right * c + +# Get shortest angular direction between two angles +def getRotationDirection(alpha, beta): + phi = abs(beta - alpha) % 360 # This is either the distance or 360 - distance + distance = 360 - phi if phi > 180 else phi + return distance if (alpha - beta >= 0 and alpha - beta <= 180) or (alpha - beta <= -180 and alpha 
- beta >= -360) else distance * -1 + # Return correctly typed config parser option def get_config_value(parser, section, option): """Parses config value as python type""" @@ -74,6 +131,50 @@ def monotonic_time(): ### Define Classes ### ================================ +class Stats: + def __init__(self): + self.yaw = 0 + + def reset(self): + self.lastX = None + self.lastTime = None + self.stillCounter = 0 + self.waitUntilAtRest = False + self.lastTargetVisible = None + self.printTelemetry = False + + def compute(self, nowTime, targetVisible, north, east): + if self.lastTargetVisible is not None and targetVisible != self.lastTargetVisible: + if targetVisible: + print("Target visible") + else: + print("Target not visible") + + self.lastTargetVisible = targetVisible + + if not targetVisible: + self.reset() + return + + self.x = np.array([east, north]) + + if self.lastX is not None: + self.dx = self.x - self.lastX # Distance between samples + self.sec = (nowTime - self.lastTime) / 1e+9 # Time between samples + self.v = np.linalg.norm(self.dx) / self.sec # Velocity between samples + + if self.printTelemetry: + print("\tv: " + str(self.v)) + + self.yawRate = (self.yaw - self.lastYaw) / self.sec + + else: + self.dx = np.array([0,0]) + + self.lastX = self.x + self.lastTime = nowTime + self.lastYaw = self.yaw + # Threaded Reader class connects the output from the track_targets process to a non-blocking reader, and adds any messages to a queue. # The queue is then read by the main loop and if anything is waiting it calls a method to process from the track_targets object. class ThreadedReader: @@ -82,6 +183,7 @@ class ThreadedReader: self.stderr = stderr self.queue = LifoQueue() self._stop = Event() + def insertQueue(pipe, queue, qtype="stdout"): while not self.stopped(): self.clear() # Clear the queue before adding anything - we only want the latest data @@ -91,15 +193,18 @@ class ThreadedReader: if line and qtype == "stdout": queue.put(line) elif line and qtype == "stderr": - queue.put("error:"+line) + queue.put("error:" + line) + self.threadout = Thread(target = insertQueue, args = (self.stdout, self.queue)) self.threadout.daemon = True self.threadout.start() self.threaderr = Thread(target = insertQueue, args = (self.stderr, self.queue, "stderr")) self.threaderr.daemon = True self.threaderr.start() + def size(self): return self.queue.qsize() + def readline(self, timeout = None): try: line = self.queue.get(block = timeout is not None, timeout = timeout) @@ -108,17 +213,23 @@ class ThreadedReader: except Empty: log.debug("empty queue") return None + def clear(self): #with self.queue.mutex: # self.queue.clear() - while self.size() > 10: + + # TODO: BUG: Important line ("initcomp") is added and immediately removed without being processed, because of multiple OpenCV/Gstreamer initialization messages. 
+ #while self.size() > 10: + while self.size() > 100: # self.readline() tmpline = self.queue.get(False) log.debug("clearing queue: {} : {}".format(self.size(), tmpline)) self.queue.task_done() return + def stop(self): self._stop.set() + def stopped(self): return self._stop.isSet() @@ -140,6 +251,7 @@ class TrackTime: self.t = timespec() # Setup listener decorator @self.vehicle.on_message("TIMESYNC") + def listener_timesync(subself, name, message): try: if message.ts1 < self.mytime_nanos: @@ -164,6 +276,7 @@ class TrackTime: except: if self.debug: log.debug("Error, unexpected timesync timestamp received:"+str(message.ts1)) + # Blocking method to call once during startup, to calculate delta def initial_sync(self): # Keep sending timesync requests until the delta buffer is full, or a timeout is reached @@ -176,9 +289,11 @@ class TrackTime: self.delta = sum(list(self.delta_buffer)) / len(list(self.delta_buffer)) else: log.warning("Error creating timesync delta buffer - unpredictable results will follow") + # Method to return monotonic clock time def current_timestamp(self): return monotonic_time() + # Request a timesync update from the flight controller def update(self, ts=0, tc=0): if ts == 0: @@ -188,17 +303,21 @@ class TrackTime: tc, # tc1 ts # ts1 ) + self.vehicle.send_mavlink(msg) self.vehicle.flush() + # Note: This returns the 'actual' time of the FC with the delay compensated from the last sync event def actual(self): return self.fctime_nanos + # Return actual + time elapsed since actual set def estimate(self): estmynanos = self.current_timestamp() - self.difference estfcnanos = self.actual() + (self.current_timestamp() - self.mytime_nanos) #log.debug("Estimate time: current-difference:"+str(estmynanos/1000000)+", fctime_nanos+elapsed:"+str(estfcnanos/1000000)+", difference:"+str((estmynanos - estfcnanos) / 1000000)) return estmynanos + def toFcTime(self, inTime): return inTime - self.difference @@ -215,9 +334,11 @@ class TrackTargets: # Launch tracking process and process output log.info("Launching track_targets with arguments:" + " ".join(self.track_arguments)) self.launch() + def setup(self, args, calibration): # Define the basic arguments to track_targets - self.track_arguments = [cwd+'/track_targets'] + self.track_arguments = [cwd + '/track_targets'] + # If marker dictionary is set, pass it through if args.markerdict: self.track_arguments.extend(['-d', args.markerdict]) @@ -259,18 +380,24 @@ class TrackTargets: self.track_arguments.extend(['--markerthreshold', str(args.markerthreshold)]) # Add positional arguments self.track_arguments.extend([args.input, calibration, str(args.markersize)]) + def launch(self): - self.process = Popen(self.track_arguments, stdin = PIPE, stdout = PIPE, stderr = PIPE, shell = False, bufsize=1) + global cwd + self.process = Popen(self.track_arguments, stdin = PIPE, stdout = PIPE, stderr = PIPE, shell = False, bufsize=1, cwd = cwd) self.reader = ThreadedReader(self.process.stdout, self.process.stderr) self.state = "initialising" + def processline(self, line): + #print("Processing line: " + line) + # Unpack data from track_targets data trackdata = line.rstrip().split(":") + # If not target data, log and skip to next data if trackdata[0] != "target": datatype = trackdata[0] if trackdata[0] == "error": - log.error("track_targets: " +str(trackdata[1:])) + log.error("track_targets: " + str(trackdata[1:])) elif trackdata[0] == "debug" and args.verbose: log.debug("track_targets: " + datatype[0].upper() + datatype[1:] + str(trackdata[1:])) elif trackdata[0] == 
"info": @@ -278,23 +405,60 @@ class TrackTargets: if trackdata[1] == "initcomp": self.state = "initialised" return + # Unpack and cast target data to correct types try: - [id,x,y,z,processtime,sensorTimestamp,sendTimestamp,receiveTimestamp] = trackdata[1:] - [id,x,y,z,processtime,sensorTimestamp,sendTimestamp,receiveTimestamp] = [int(id),float(x),float(y),float(z),float(processtime),float(sensorTimestamp),float(sendTimestamp),float(receiveTimestamp)] + [id,x,y,z,processtime,sensorTimestamp,sendTimestamp,relMarkerOrientation,receiveTimestamp,imgYaw] = trackdata[1:] + [id,x,y,z,processtime,sensorTimestamp,sendTimestamp,relMarkerOrientation,receiveTimestamp,imgYaw] = [int(id),float(x),float(y),float(z),float(processtime),float(sensorTimestamp),float(sendTimestamp),float(relMarkerOrientation),float(imgYaw),float(receiveTimestamp)] except: log.debug("track_targets: error in data: " + str(line.rstrip())) return + # If fake rangefinder distance is required, send it if args.fakerangefinder: craft.send_distance_message(int(z*100)) + # Send the landing_target message to arducopter # NOTE: This sends the current estimated fctime, NOT the time the frame was actually captured. # So this does not allow for the CV processing time. This should be sending when the frame is actually captured, before processing. nowTime = monotonic_time() + # If data is less than 250ms since the image shuttr, send the message. if nowTime - sensorTimestamp < 250 * 1e+6: - craft.send_land_message(x, y, z, tracktime.toFcTime(sensorTimestamp) / 1000, id) # Send sensor timestamp adjusted to timesync, in micros + if useSmartLanding: + # --- IX --- + + # PROBLEM: The marker coords are in forward-right coordinate frame (relative to the drone's yaw at the time of the image). + # If the drone rotates, the direction to the marker also rotates and is not valid anymore. + # SOLUTION: We will express directions in a drone's yaw invariant coordinate frame (local NED). + # We don't use a global coordinate frame, because the drone won't always know its global location and orientation. + # Local NED is not global, but relative to the initial takeoff coordinates. + + if imgYaw == 1000: + # track_targets is not providing the yaw with the image => use current yaw + yawRad = craft.vehicle.attitude.yaw + craft.stats.yaw = math.degrees(yawRad) + else: + yawRad = math.radians(imgYaw) + craft.stats.yaw = imgYaw + + # relMarkerOrientation: orientation of the marker (0° = top side of camera = "drone forward"; -90° = top side of marker is on the right side of the screen = "drone right") + # eg. if relMarkerOrientation = 0 and imgYaw 90° (drone is heading east) => markerOrientation is -90° (west). 
+ + forward,right = -y,x + north,east = rotate(forward,right,yawRad) + + craft.processState(nowTime, True, sensorTimestamp, north, east, relMarkerOrientation, z) + + if sim is not None: + sim.obsDroneX = sim.obsDroneX + craft.outV + + else: + craft.send_land_message(x, y, z, tracktime.toFcTime(sensorTimestamp) / 1000, id) # Send sensor timestamp adjusted to timesync, in micros + + else: + print("Target timed out since: " + str((nowTime - sensorTimestamp) / 1e+9)) + # Track frame timing to calculate fps, running average over last 10 frames end_t = time() time_taken = end_t - self.start_t @@ -303,6 +467,7 @@ class TrackTargets: self.frame_times = self.frame_times[-10:] fps = len(self.frame_times) / sum(self.frame_times) log.info("Fctime:{:.0f}:{:.0f}, Fps:{:.2f}, Alt:{}, Rng:{}, Marker:{}, Distance:{}, xOffset:{}, yOffset:{}, processTime:{:.0f}, sensorTime:{:.0f}, sendTime:{:.0f}, receiveTime:{:.0f}, currentTime:{:.0f}, sensorDiff:{}, sendDiff:{}, receiveDiff:{}".format(tracktime.actual(), tracktime.estimate(), fps, craft.vehicle.location.global_relative_frame.alt, craft.vehicle.rangefinder.distance, id, z, x, y, processtime, sensorTimestamp, sendTimestamp, receiveTimestamp, nowTime, nowTime - sensorTimestamp, nowTime - sendTimestamp, nowTime - receiveTimestamp)) + def start(self): if self.state == "initialised": log.info("Requesting track_targets to start tracking:"+str(craft.vehicle.mode)) @@ -310,10 +475,12 @@ class TrackTargets: self.state = "started" else: log.info("Request to start track_targets tracking failed, state is not 'initialised'") + def stop(self): log.info("Requesting track_targets to stop tracking:"+str(craft.vehicle.mode)) self.state = "initialised" self.process.send_signal(signal.SIGUSR2) + def end(self): log.info("Shutting down track_targets") # Stop threaded reader @@ -324,6 +491,30 @@ class TrackTargets: self.process.terminate() self.shutdown = True + def waitImage(self): + printNoLF("Waiting for video streamer...") + + # Clear the vision tracking queue, we don't want to blast mavlink messages for everything stacked up + while True: + if self.process.poll() is not None: + print("\nERROR: track_targets is not running") + exit(1) + + line = self.reader.readline() + if not line: + sleep(0.1) + + elif not search("^target:", line): + if search("^error:", line): + log.error("track_targets: "+sub("^error:","",line)) + # Make sure to catch and process initcomp if it's in this initial queue + elif search("initcomp", line): + self.processline(line) + print("ok") + return + else: + log.info("track_targets: "+line) + # The Craft class connects to the flight controller and sends controlling messages class Craft: def __init__(self, connectstr): @@ -335,6 +526,373 @@ class Craft: self.precloiter_opt = self.find_precloiter_opt() if self.precloiter_opt: log.info("Precision loiter switch found: Channel "+str(self.precloiter_opt)) + + def startActions(self): + """ + In self.actions we define actions and states that will be exectued. + We can use self.nextState() to jump to next state. + + The format is: + + : [ + , + , + ... + ], + : [ + , + , + ... 
+ ] + """ + self.actions = { + "land" : [ + "start" + ], + "measure" : [ + "yaw-rate-start", + "yaw-rate-run", + + "yaw-rate-start", + "yaw-rate-run", + + "yaw-rate-start", + "yaw-rate-run", + + "yaw-rate-start", + "yaw-rate-run", + + "yaw-rate-start", + "yaw-rate-run", + + "yaw-rate-start", + "yaw-rate-run", + + "yaw-rate-start", + "yaw-rate-run", + ] + } + """ + Disabled actions, kept here for reference: + + "measure" : [ + "stability-start", + "stability", + + "visual-lag-start", + "visual-lag", + + "velocities" + ] + """ + + self.visualLag = None # Measured visual lag + self.count = 0 + + np.set_printoptions(suppress=True) + + self.actionIterator = iter(self.actions) + self._nextAction() + + # Init vars + self.stats.reset() + + def processState(self, nowTime, targetVisible, sensorTimestamp = None, north = None, east = None, markerOrientation = None, z = None): + if not craft.connected: + return + + self.lastDetection = nowTime + + self.stats.compute(nowTime, targetVisible, north, east) + + if self.action == "measure": + if targetVisible: + self.measure(nowTime, north, east, markerOrientation) + + elif self.action == "land": + self.land(nowTime, targetVisible, sensorTimestamp, north, east, markerOrientation, z) + + def waitAtRest(self): + print("Waiting until at rest...") + self.waitUntilAtRest = True + self.stillCounter = 0 + + def waitTransient(self): + print("Waiting until transient is over") + # TODO: Not implemented. + + def _nextAction(self): + self.action = next(self.actionIterator, None) + if self.action is None: + print("Ready") + exit(0) + else: + print "Action: " + self.action + self.stateIterator = iter(self.actions[self.action]) + self.nextState() + + def nextState(self): + self.state = next(self.stateIterator, None) + if self.state is None: + self._nextAction() + else: + print "\n--- State: " + self.state + + # --- Actions --- + + # Here we implement a sequence of tests to measure the drone dynamics. + def measure(self, nowTime, north, east, markerOrientation): + stats = self.stats + + if self.waitUntilAtRest: + if stats.v < 0.05: + self.stillCounter += 1 + if self.stillCounter >= 10: + print("Drone is at rest") + self.waitUntilAtRest = False + else: + self.stillCounter = 0 + + elif self.state == "visual-lag-start": + # Perform a sudden movement that can be easily detected to measure the visual latency. + # TODO: Detect yaw rate? + # TODO: Measure accumulated distances instead of velocities.
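+ # In short: the "kick" below is commanded at testStartTime, and the vision track is then watched for the
+ # moment the resulting peak image-space velocity appears (maxVelTime). The difference
+ # maxVelTime - testStartTime is taken as the end-to-end visual latency and passed to drone.setCameraDelay().
+ # For example (made-up numbers), a kick sent at t = 0 s whose peak shows up at t = 0.45 s gives a lag of ~0.45 s.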
+ print("\nMeasuring visual lag...") + self.motionDirection = x / np.linalg.norm(x) + self.startX = x + + self.testStartTime = nowTime + self.testEndTime = nowTime + 5 * 1e+9 + self.maxVel = 0 + self.maxVelTime = None + self.state = "visual-lag" + + # Sudden movement easy to detect + destVel = self.motionDirection * 0.2 + print("\nKicking") + self.move(destVel[1], destVel[0], 0, 0) + + elif self.state == "visual-lag": + projVel = -np.dot(stats.dx, self.motionDirection) # Velocity in projected direction + if projVel > self.maxVel: + self.maxVel = projVel + self.maxVelTime = nowTime + + if nowTime >= self.testEndTime: + if self.maxVelTime is None: + print("ERROR: Test failed => repeat") + self.state = "" + else: + self.visualLag = self.maxVelTime - self.testStartTime + drone.setCameraDelay(self.visualLag / 1e+9) + dist = np.linalg.norm(x - self.startX) + print("Visual lag: " + str(self.visualLag / 1e+9) + ", dist: " + str(dist)) + + self.count += 1 + if self.count >= 10: + self.nextState() + else: + # Repeat + self.state = "visual-lag-start" + + elif self.state == "velocities": + self.count = 0 + self.state = "velocity-start" + + elif self.state == "velocity-start": + self.vScalar = self.count * 0.05 + print("\nVelocity test: " + str(self.vScalar) + " [m/s]") + if self.vScalar > 0.5: + print("DANGER: Velocity to high?") + exit(1) + self.destVel = x / np.linalg.norm(x) * self.vScalar + self.startX = x + self.testStartTime = nowTime + self.state = "velocity-rest" + self.waitAtRest() + print("TODO: Remove transient") + + elif self.state == "velocity-rest": + self.state = "velocity-run" + self.testEndTime = nowTime + 2 * 1e+9 + # TODO: Do waitTransient() + + elif self.state == "velocity-run": + self.move(self.destVel[1], self.destVel[0], 0, 0) + + if nowTime >= self.testEndTime: + dist = np.linalg.norm(x - self.startX) + sec = (nowTime - self.testStartTime) / 1e+9 + avgVel = dist / sec + + print("Avg. 
velocity: " + str(avgVel)) + + if self.vScalar: + self.velFactor = v / self.vScalar + print("Velocity factor: ", self.velFactor) + + if self.count >= 4: + self.nextState() + else: + self.count += 1 + self.state = "velocity-start" + + elif self.state == "yaw-rate-start": + self.startYaw = math.degrees(self.vehicle.attitude.yaw) + self.testStartTime = nowTime + self.testEndTime = nowTime + 2 * 1e+9 + self.nextState() + + elif self.state == "yaw-rate-run": + currentYaw = math.degrees(self.vehicle.attitude.yaw) + deltaYaw = currentYaw - self.startYaw + # print("deltaYaw: " + str(deltaYaw) + ";" + str(markerOrientation)) + self.move(0, 0, 0, 5) + print("yaw: " + str(currentYaw) + ";" + str(markerOrientation)) + + if nowTime >= self.testEndTime: + self.nextState() + + elif self.state == "stability-start": + print("Measuring stability...") + self.positions = [] + self.testStartTime = nowTime + self.testEndTime = nowTime + 5 * 1e+9 + self.nextState() + + elif self.state == "stability": + self.positions.append(x) + + if nowTime >= self.testEndTime: + arr = np.array(self.positions) + + center = np.mean(self.positions, axis=0) + #std_dev = np.std(self.positions) + + minX = np.amin(arr[:, 0]) + maxX = np.amax(arr[:, 0]) + minY = np.amin(arr[:, 1]) + maxY = np.amax(arr[:, 1]) + + self.stability = { + 'errX': maxX - minX, + 'errY': maxY - minY + } + print(self.stability) + + self.nextState() + + def land(self, nowTime, targetVisible, sensorTimestamp, north, east, markerOrientation, z): + MAX_SQR_DIST = 0.2 **2 + + LOCKED_ALT_CENTER = 1 # Will only continue descending if we are centered + LOCKED_ALT_STAY = 0.7 # Will stay here + + testFactor = 2 # TODO: TEMP: + + LOCKED_SQR_DIST = 0.2 **2 + LOCKED_MAX_VEL = 0.1 * testFactor + LOCKED_MAX_YAW_RATE = 0.05 * testFactor + + # TODO: alt and z differe too much! Calibrate + alt = self.vehicle.location.global_relative_frame.alt + + if not self.gimbalIsDown: + print("Moving gimbal down...") + self.setServo(9, -90) + self.gimbalIsDown = True + + if not targetVisible: + craft.goBack() + return + + if self.state == "start": + self.state = "descend" + self.lockdownCounter = 0 + self.setServo(9, -90) + + elif self.state == "descend": + nortVel = 0 + eastVel = 0 + downVel = 0 + yawRate = 0 + + currentYaw = math.degrees(self.vehicle.attitude.yaw) + + if False: # TODO: Use SmartLanding + dv = drone.process(nowTime / 1e+9, sensorTimestamp / 1e+9, np.array([east, north])) + self.outV = self.outV + dv + nortVel = self.outV[1] + eastVel = self.outV[0] + + #print("dv: " + str(dv) + ", v: " + str(self.outV)) + + else: + # TODO: CHECK: Should be equivalent, but has some error + # TODO: Use getRotationDirection() ? + #dstYaw = currentYaw - 2 * pi + (-markerOrientation) + + dstYaw = currentYaw - markerOrientation + + yawRate = (dstYaw - currentYaw) * 0.5 + + factor = 0.15 + nortVel = north * factor + eastVel = east * factor + + # Down motion + dist = east **2 + north **2 + + # TODO: Also use z? + # TODO: Smooth with pid or other function + if dist < MAX_SQR_DIST: + if alt < LOCKED_ALT_CENTER: + downVel = 0 # Stay here unless dist < LOCKED_SQR_DIST + elif alt < 2: + downVel = 0.15 + else: + downVel = alt / 5 + + # Final landing + locked = False + if dist < LOCKED_SQR_DIST: + if alt < LOCKED_ALT_STAY: + downVel = 0 + + if self.stats.v < LOCKED_MAX_VEL and self.stats.yawRate < LOCKED_MAX_YAW_RATE: + locked = True + self.lockdownCounter += 1 + if self.lockdownCounter >= 20: + print("Drone stabilized. 
Proceed with final landing.") + self.send_nav_land_message() + + self.gimbalIsDown = False # If landing is interrupted, the gimbal needs to rotate down again + + sleep(10) # HACK: Avoid goBack() while landing + + if self.vehicle.armed: + print("ERROR: Landing failed (drone is still armed)..retrying") + else: + global doExit + doExit = True + print("Vehicle disarmed") + + if not locked and self.lockdownCounter > 0: + print("Drone moved: v: " + str(self.stats.v) + ", yawRate: " + str(self.stats.yawRate) + "\nLockdown counter reset") + + elif alt < LOCKED_ALT_CENTER: + downVel = 0.05 + else: + print("Drone not centered. distance: " + str(dist)) + + if not locked: + self.lockdownCounter = 0 + + if nortVel != 0 or eastVel != 0 or downVel != 0 or yawRate != 0: + print("alt: " + str(alt) + ", z: " + str(z) + ", D: " + str(downVel) + ", E: " + str(east) + ", N: " + str(north)) + + self.move(nortVel, eastVel, -downVel, yawRate) + def connect(self, connectstr): try: # self.vehicle = connect(connectstr, wait_ready=True, rate=1) @@ -345,6 +903,7 @@ class Craft: except Exception,e: log.critical("Error connecting to vehicle:"+str(repr(e))) self.connected = False + # Define function to send distance_message mavlink message for mavlink based rangefinder, must be >10hz # http://mavlink.org/messages/common#DISTANCE_SENSOR def send_distance_message(self, dist): @@ -362,9 +921,92 @@ class Craft: self.vehicle.flush() if args.verbose: log.debug("Sending mavlink distance_message:" +str(dist)) + + # Return to initial altitude + # TODO: Before that, try to return to last position where we saw the marker. If that fails, return to initial altitude and position. + def goBack(self): + alt = self.vehicle.location.global_relative_frame.alt + if alt > self.maxLandingAltitude: + print("Too high, going down") + upVel = -0.8 + self.move(0, 0, upVel, 0) + + elif self.initialAltitude < self.maxLandingAltitude and alt < self.initialAltitude: + print("Lost marker, going up") + upVel = 0.2 + self.move(0, 0, upVel, 0) + else: + # TODO: Loiter arround to find marker + print("Lost marker") + + def setServo(self, servo, val): + msg = self.vehicle.message_factory.command_long_encode( + 0, # time_boot_ms (not used) + 0, # target system, target component + mavutil.mavlink.MAV_CMD_DO_SET_SERVO, + 0, + servo, # RC channel... 
+ 1500+(val*5.5), # RC value + 0, 0, 0, 0, 0) + self.vehicle.send_mavlink(msg) + self.vehicle.flush() + + def move(self, vx, vy, vz, yawRate, time_usec=0): + if args.verbose: + log.debug("Sending mavlink set_position_target_local_ned - time_usec:{:.0f}, dx:{}, dy:{}".format(time_usec, str(vx), str(vy))) + + msg = self.vehicle.message_factory.set_position_target_local_ned_encode( + 0, # TODO: Crashes with time_usec + + 0, # SysId + 0, # CompId + + mavutil.mavlink.MAV_FRAME_LOCAL_NED, # frame + 0b111, # mask (ignore x,y,z) + + 0, # x + 0, # y + 0, # z + + vx, # vx + vy, # vy + vz, # vz + + 0, # afx + 0, # afy + 0, # afz + + 0, # yaw + yawRate, # yaw rate + ) + self.vehicle.send_mavlink(msg) + self.vehicle.flush() + + def send_nav_land_message(self): + msg = self.vehicle.message_factory.command_long_encode( + 1, # system_id + 1, # component_id, + mavutil.mavlink.MAV_CMD_NAV_LAND, # command ID + 0, # confirmation + 0, 0, 0, 0, # parameters 1-4 (not used) + 0, # parameter 5: target altitude + 0, 0 # parameters 6-8 (not used) + ) + """ + param1=10, # descend time + param2=landing_target_msg.x, + param3=landing_target_msg.y, + param4=0, # desired yaw angle + param5=1, # target ID + param6=landing_target_msg.size_x, + param7=MAV_LANDING_TARGET_TYPE_LIGHT_BEACON + """ + self.vehicle.send_mavlink(msg) + self.vehicle.flush() + # Define function to send landing_target mavlink message for mavlink based precision landing # http://mavlink.org/messages/common#LANDING_TARGET - def send_land_message(self, x,y,z, time_usec=0, target_num=0): + def send_land_message(self, x, y, z, time_usec=0, target_num=0): msg = self.vehicle.message_factory.landing_target_encode( time_usec, # time target data was processed, as close to sensor capture as possible target_num, # target num, not used @@ -385,6 +1027,7 @@ class Craft: self.vehicle.flush() if args.verbose: log.debug("Sending mavlink landing_target - time_usec:{:.0f}, x:{}, y:{}, z:{}".format(time_usec, str(x), str(y), str(z))) + # Define function that arms and takes off, used for testing in SITL # Lifted from dronekit-python examples def arm_and_takeoff(self, targetAltitude): @@ -425,10 +1068,13 @@ class Craft: class SigTrack: def __init__(self): self.counter = 0 + def add(self): self.counter += 1 + def count(self): return self.counter + def handle_sig(self, signal, frame): log.info("Signal handler called, calling shutdown and cleanup logic") track_targets.end() @@ -438,6 +1084,11 @@ class SigTrack: log.warning("Signal handler called three times, exiting immediately") exit(1) + print("\nWaiting for children...") + track_targets.process.wait() + + raise SystemExit('Shuting down') # Interrupt lbocking methods in main thread (eg: self.find_precloiter_opt()) + ### ================================ ### End Define Classes ### ================================ @@ -511,10 +1162,16 @@ if not args.calibration or not args.markersize or not args.connect: # Setup logging console = logging.StreamHandler() + if args.logdir: # If logdir specified, create if it doesn't exist if not path.exists(args.logdir): makedirs(args.logdir) + + # Bug fix. 
See: https://stackoverflow.com/questions/30861524/logging-basicconfig-not-creating-log-file-when-i-run-in-pycharm + for handler in logging.root.handlers[:]: + logging.root.removeHandler(handler) + # Add timestamped logfile out logging.basicConfig(filename=args.logdir+'/vision_landing.'+nowtime+'.log', format='%(asctime)s %(levelname)s %(message)s', level=logging.DEBUG) # Create handly symlink to current logfile @@ -553,7 +1210,7 @@ except Exception as error: exit(1) track_targets.process.poll() if track_targets.process.returncode != None: - log.critical("Error starting track_targets: ".format(track_targets.process.returncode)) + log.critical("Error starting track_targets: "+format(track_targets.process.returncode)) exit(1) # Define a function that stops and clears up cv process @@ -568,6 +1225,9 @@ def cleanup(): except: pass +# Do a cleanup() on a crash. Otherwise track_targets will continue running. +atexit.register(cleanup) + ### -------------------------------- ### Start up procedure - setup signal handlers, ### connect to drone, start syncing time with drone, @@ -580,14 +1240,17 @@ signal.signal(signal.SIGINT, sigtracker.handle_sig) signal.signal(signal.SIGTERM, sigtracker.handle_sig) # Connect to the Vehicle -log.info("Connecting to drone on: %s" % args.connect) +log.info("Connecting to drone [%s]" % args.connect) +printNoLF("Connecting to drone [%s]..." % args.connect) craft = Craft(args.connect) if not craft.connected: + print("Error: Could not connect to drone") log.critical("Error: Could not connect to drone") cleanup() exit(1) else: log.info("Connected to drone") + print("ok") # Create a new TrackTime object to try and keep sync with the flight controller log.info("Starting time synchronisation with flight controller") @@ -596,7 +1259,9 @@ tracktime.debug = False # Perform the initial sync with the flight controller. This blocks until the sync is complete and good enough. while True: + printNoLF("Syncing time...") tracktime.initial_sync() + print("ok") if not tracktime.delta: log.warning("Timesync not supported by flight controller. 
THIS CAN LEAD TO DANGEROUS AND UNPREDICTABLE BEHAVIOUR.") break @@ -658,6 +1323,7 @@ if args.simulator: craft.vehicle.parameters['RNGFND_MAX_CM'] = 4000 craft.vehicle.parameters['RNGFND_PIN'] = 0 craft.vehicle.parameters['RNGFND_SCALING'] = 12.12 + # Take off to 25m altitude and start landing if not already armed if not craft.vehicle.armed: craft.arm_and_takeoff(100) @@ -665,19 +1331,38 @@ if args.simulator: log.info("Starting landing...") craft.vehicle.mode = VehicleMode("LAND") -# Clear the vision tracking queue, we don't want to blast mavlink messages for everything stacked up -while True: - line = track_targets.reader.readline() - if not line: - break - elif not search("^target:", line): - if search("^error:", line): - log.error("track_targets: "+sub("^error:","",line)) - # Make sure to catch and process initcomp if it's in this initial queue - elif search("initcomp", line): - track_targets.processline(line) - else: - log.info("track_targets: "+line) +# If we loose the target, move back to initial altitude +craft.initialAltitude = craft.vehicle.location.global_relative_frame.alt +craft.maxLandingAltitude = 5 # TODO: Get from config file +craft.lastDetection = 0 +craft.outV = np.array([0,0]) +craft.stats = Stats() +print("Initial altitude: " + str(craft.initialAltitude)) + +track_targets.waitImage() + +print("Take off") +if True: + targetAltitude = 1 + craft.vehicle.simple_takeoff(targetAltitude) + while True: + print("Altitude: " + str(craft.vehicle.location.global_relative_frame.alt)) + if craft.vehicle.location.global_relative_frame.alt >= targetAltitude * 0.90: + print("Reached target altitude") + break + sleep(1) + + craft.gimbalIsDown = False + +if False: + fps = 15 + print("Rotating") + for i in range(fps * 10): + craft.move(0, 0, 0, 50) + sleep(1.0 / 15) + +print("Start actions") +craft.startActions() ### -------------------------------- ### Main Loop @@ -685,8 +1370,14 @@ while True: ### Listen for incoming cv tracking results ### -------------------------------- log.info("Entering main tracking loop") +if args.controlprocessing: + print("Waiting LAND mode...") loop_counter = 0 -while True: +lastMsgTime = 0 + +doExit = False + +while not doExit: # Update timesync tracktime.update() @@ -712,6 +1403,12 @@ while True: log.debug("Main loop queue read empty, sleeping for a few ms") sleep(0.1) + now = monotonic_time() + elapsedSec = (now - craft.lastDetection) / 1e+9 + if elapsedSec > 1: + # Target timeout + craft.processState(now, False) + if args.simulator and not craft.vehicle.armed and not track_targets.shutdown: log.info("Landed") track_targets.end()
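One detail of `Craft.land()` above that is easy to miss: the final `MAV_CMD_NAV_LAND` is only sent after the drone has been observed stable (low velocity and yaw rate while within `LOCKED_SQR_DIST` of the landing point) for 20 consecutive frames, and any movement resets the counter. The sketch below shows that debounce in isolation; the `update_lockdown` helper and the sample values are illustrative only, with thresholds mirroring `LOCKED_MAX_VEL` and `LOCKED_MAX_YAW_RATE` (with `testFactor = 2`) from the code above.

```
# Sketch of the lockdown debounce used by Craft.land() (illustrative helper, not part of the script)
LOCKED_MAX_VEL = 0.1 * 2        # [m/s], testFactor = 2 as in the code above
LOCKED_MAX_YAW_RATE = 0.05 * 2  # [deg/s]
LOCKDOWN_FRAMES = 20

def update_lockdown(counter, velocity, yaw_rate):
    """Return (new_counter, commit_landing) for one vision frame."""
    if velocity < LOCKED_MAX_VEL and yaw_rate < LOCKED_MAX_YAW_RATE:
        counter += 1
        return counter, counter >= LOCKDOWN_FRAMES
    return 0, False  # drone moved: reset the debounce and keep hovering

# Example: after 19 stable frames, one more stable frame commits the landing.
counter, commit = update_lockdown(19, velocity=0.05, yaw_rate=0.01)
print("counter=%d commit=%s" % (counter, commit))  # -> counter=20 commit=True
```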