
Deep learning technique - based drone detection and tracking


Abstract

- The Single Shot Detector (SSD) object detection algorithm with the MobileNet-v2 architecture as its backbone was used in our experiments.
- Drone detection and tracking.
- SSD-MobileNet-v2.
- The development of drone surveillance systems is necessary, and one of the most important requirements for a drone detection and tracking system is real-time performance.
- Many studies have been presented with the aim of creating a robust, efficient drone detection and tracking system.
- In this paper, we introduce an efficient drone detection and tracking algorithm that combines the SSD-MobileNet-v2 object detector [5] with the Dlib correlation tracker [6] and a centroid tracking algorithm [7].
- We first convert the video into a sequence of frames and feed them to the trained model, which was fine-tuned on our synthetic drone dataset, to recognize drones in the frames and obtain bounding boxes around the targets.
- The bounding-box information of the targets is passed to the Dlib correlation tracker and the centroid tracker, which track the targets across subsequent frames along with their IDs.
- In Section II we introduce the object detection algorithm and the training process, from dataset preparation to the parameter settings used to obtain the trained model, as well as the Dlib correlation tracker and centroid tracker algorithms.
- We present our experiment results and evaluate the performance of our drone detection and tracking system in Section III.
- Object detection is an important task in computer vision applications.
- The SSD-MobileNet-v2 detector and its drone detection capabilities are analyzed and discussed in this paper.
- The SSD-MobileNet-v2 network is divided into two parts: MobileNet-v2, which extracts features, and the Single Shot MultiBox Detector (SSD) head, which produces the classification and localization results [8].
- MobileNet-v2 works as a feature extractor.
- The extracted features are fed into the SSD network to determine the class and location of the detected objects in the captured images.
- The advantage of SSD-MobileNet-v2 is that it provides a more balanced trade-off between speed and accuracy when compared to other state-of-the-art models with similar network architectures, such as YOLO and Faster R-CNN [9].
- The SSD-MobileNet-v2 model is part of the TensorFlow Object Detection API and is pre-trained on the MS-COCO dataset, which consists of more than 300,000 images and 80 object classes; however, this dataset does not include a drone class.
- Drone detection using SSD-MobileNet-v2.
- Dataset preparation is one of the most important steps of the deep learning model training process.
- The fine-tuning technique of transfer learning is used to re-train the model on a custom dataset of 25,000 synthetic drone images that the original model was not trained on.
- The drone images were captured from video showing the drone at various perspectives and angles, and then pre-processed to obtain drone images of multiple sizes on white backgrounds.
- The background images were collected from many sources with the aim of increasing the complexity of the environment in which the drone was operating.
- The accompanying annotations tell our model where we placed the drone in each image.
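As an illustration of how such a synthetic sample and its annotation can be produced, the following sketch composites a drone crop onto a background image and records the resulting bounding box. The paper does not specify its tooling, so the function and its naive opaque paste are hypothetical:

```python
def composite(background, drone, top, left):
    """Paste a drone crop (2-D list of pixels) onto a background image
    and return the image plus an (xmin, ymin, xmax, ymax) annotation box.
    A hypothetical sketch -- the paper does not describe its tooling."""
    h, w = len(drone), len(drone[0])
    out = [row[:] for row in background]      # copy, keep the original intact
    for r in range(h):
        out[top + r][left:left + w] = drone[r]  # naive opaque paste
    return out, (left, top, left + w, top + h)

# Toy example: an 8x8 light background and a 2x3 dark "drone" crop.
bg = [[200] * 8 for _ in range(8)]
crop = [[0] * 3 for _ in range(2)]
img, box = composite(bg, crop, top=4, left=2)
print(box)  # (2, 4, 5, 6)
```

In practice the same idea is applied with real background photos and the annotation is written out in the format the training pipeline expects (e.g. Pascal VOC XML or TFRecord).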
- Training process.
- In this paper, the SSD_MobileNet_v2_coco model was used.
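When fine-tuning through the TensorFlow Object Detection API, the relevant settings live in the model's pipeline.config file; a minimal fragment might look as follows (the values shown are illustrative, not the paper's actual settings):

```
model {
  ssd {
    num_classes: 1        # a single "drone" class replaces the 80 COCO classes
  }
}
train_config {
  batch_size: 24          # illustrative; the paper tunes this hyperparameter
  fine_tune_checkpoint: "ssd_mobilenet_v2_coco/model.ckpt"
}
```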
- Every frame in which the target is successfully tracked provides more information about the identity and the activity of the target [12].
- The Dlib correlation tracker is widely used in image processing for object tracking.
- Diagram of the drone detection and tracking system.
- On the other hand, the centroid tracking algorithm tracks the centroid of each detected object across subsequent video frames.
- The Euclidean distances between each pair of centroids are used to associate the new object’s centroid with the previous object’s centroid.
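The association step can be sketched as follows. This is a simplified, greedy nearest-centroid matcher; the exact matching rules (distance threshold, disappearance handling) are assumptions, not the paper's implementation:

```python
from math import dist

def associate(prev, new):
    """Greedily match the previous frame's tracked centroids
    ({id: (x, y)}) with the new frame's detections ([(x, y), ...])
    by Euclidean distance; unmatched detections get fresh IDs.
    A simplified sketch of the centroid tracking step."""
    assigned, unmatched = {}, list(range(len(new)))
    for oid, c in prev.items():
        if not unmatched:
            break
        j = min(unmatched, key=lambda k: dist(c, new[k]))  # nearest centroid
        assigned[oid] = new[j]
        unmatched.remove(j)
    next_id = max(prev, default=-1) + 1
    for j in unmatched:            # new objects entering the scene
        assigned[next_id] = new[j]
        next_id += 1
    return assigned

tracks = {0: (10, 10), 1: (50, 50)}
print(associate(tracks, [(52, 49), (11, 9)]))  # {0: (11, 9), 1: (52, 49)}
```

A production tracker would also drop IDs that stay unmatched for several frames; that bookkeeping is omitted here.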
- This approach to object detection and tracking in a video is shown in Fig. 3.
- The detected objects from the SSD-MobileNet-v2 Drone Detector are treated initially as targets in the first frame.
- The coordinates of the object’s location are then updated based on the new location.
- The output of the SSD-MobileNet-v2-based Drone Detector is an object class and bounding box coordinates.
- The output of the object tracker is the aforementioned bounding box and the tracked centroids, as well as an object ID (in the multiple-object detection case).
- If the object detector is successfully coupled with an object tracking system [14], so that detection is not run on every individual frame, the overall process is faster and therefore better suited to real-time requirements.
- The skip_frame value is the period after which the drone detector is run once.
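A minimal sketch of this scheme, assuming a hypothetical detect function and tracker wrapper (standing in for the SSD-MobileNet-v2 detector and the Dlib correlation tracker):

```python
def run(frames, detect, make_tracker, skip_frame=10):
    """Run the expensive detector only every `skip_frame` frames and let
    cheap per-object trackers fill the gaps. `detect(frame)` returns
    bounding boxes; `make_tracker(frame, box)` wraps a box in a tracker
    object exposing `box` and `update(frame)`. Both names are
    placeholders, not the paper's API."""
    trackers, boxes_per_frame = [], []
    for i, frame in enumerate(frames):
        if i % skip_frame == 0:                      # detection frame
            trackers = [make_tracker(frame, b) for b in detect(frame)]
            boxes_per_frame.append([t.box for t in trackers])
        else:                                        # tracking-only frame
            boxes_per_frame.append([t.update(frame) for t in trackers])
    return boxes_per_frame
```

With skip_frame=10, the detector runs on one frame in ten, so roughly 90% of frames cost only a tracker update.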
- To evaluate the performance of the drone detector and of the drone detection and tracking system, a custom evaluation dataset is used.
- Single object detection and tracking (SOT): Video2.mp4 (300).
- Multiple object detection and tracking (MOT): Video4.mp4 (300).
- Drone detector test.
- We compute the intersection of the area of the predicted bounding box and the area of the ground-truth bounding box, and divide it by the union of the two areas.
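This computation can be written directly from the box coordinates:

```python
def iou(a, b):
    """Intersection over Union of two (xmin, ymin, xmax, ymax) boxes,
    exactly as described above: intersection area / union area."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))   # overlap width
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))   # overlap height
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 = 0.142857...
```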
- For multiple object tracking performance evaluation, the Multiple Object Tracking Accuracy (MOTA) [15] metric is used.
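MOTA aggregates false negatives, false positives, and identity switches over all frames, normalized by the total number of ground-truth objects: MOTA = 1 - (FN + FP + IDSW) / GT [15]. A direct implementation of that formula:

```python
def mota(fn, fp, idsw, gt):
    """Multiple Object Tracking Accuracy (CLEAR MOT): one minus the
    total error rate, where fn/fp/idsw/gt are per-frame counts of
    false negatives, false positives, identity switches, and
    ground-truth objects."""
    return 1.0 - (sum(fn) + sum(fp) + sum(idsw)) / sum(gt)

# Toy example: 3 frames with 2 ground-truth objects each,
# one miss, one false alarm, and one ID switch in total.
print(mota(fn=[1, 0, 0], fp=[0, 1, 0], idsw=[0, 0, 1], gt=[2, 2, 2]))  # 0.5
```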
- The small drone that we used for creating the training and testing data sets for the drone detector and drone detection and tracking system is a mini quadcopter drone.
- Drone Detection.
- The results showed that, for this dataset, batch_size and the number of epochs, which control the accuracy of the error-gradient estimate when training neural networks, are the most important hyperparameters influencing the dynamics of the learning algorithm.
- Drone detection training results with different parameter settings.
- Drone detector evaluation.
- We use the IoU (Intersection over Union) value to evaluate the drone detector with the confidence value set as above.
- Fig. 4 shows the drone detection evaluation results: the aqua bounding box is the hand-labeled ground-truth box, and the red bounding box is the prediction box generated by the drone detector with the confidence threshold set to 0.5.
- Tab. 3 shows the average IoU value.
- In addition, Fig. 4 shows that the confidence and IoU values depend on the size of the target relative to the size of the frame.
- When the target is small compared with the frame (about 1/16 of the frame size), the IoU and confidence values are lower.
- When the target is large enough compared with the frame size, those values are higher.
- Drone detector test result.
- The Drone Detector was tested on images that were captured from previously unseen video footage.
- It can be seen that the drone detector effectively recognized small drones in strong light (Fig. 5(a)), against a complex background (Fig. 5(b)), flying close to trees (Fig. 5(c)), and flying close to buildings (Fig. 5(d)).
- The combination of the drone detection and tracking algorithms.
- The object detection and tracking results with different skip_frame values are as follows.
- We can see that, with the CPU configuration, different skip_frame values directly affect both the accuracy and the running speed of the system.
- With the GPU configuration, different skip_frame values affect the running speed of the system while yielding the same multiple object tracking accuracy.
- Choosing the pair of skip_frame value and confidence threshold is important for improving the running speed while maintaining the detection and tracking accuracy.
- Drone detection and tracking with different skip_frame values.
- The drone detection and tracking system was also tested on video captured with a smartphone camera and managed to detect a small drone.
- Drone detection and tracking system test result.
- Fig. 6(a) shows the tracking result without object detection; Figs. 6(b) and 6(c) show both the detection and tracking results under different background conditions; Fig. 6(d) shows the multiple object detection and tracking result.
- From these results we can see that combining detection and tracking algorithms in one system improves both the running speed and the tracking accuracy.
- Single object tracking and multiple object tracking.
- Single object tracking..
- Video1.mp4.
- Tab. 5 shows the single object tracking results when the algorithms are run on both the CPU and GPU configurations.
- We can see that, with the same input videos and parameter settings, the FPS value and tracking accuracy achieved when the algorithm is run on the GPU configuration are higher than those of the CPU configuration.
- Multiple object tracking..
- Video3.mp4.
- Tab. 6 shows the multiple object tracking results when the algorithms are run on both the CPU and GPU configurations.
- Again, with the same input videos and parameter settings, the FPS value and tracking accuracy achieved on the GPU configuration are higher than those of the CPU configuration.
- The results show that the combination of object detection and object tracking algorithms provides an effective solution for real-time small drone detection and tracking.
- The system also performed well in handling multiple object tracking.
- In this paper, we present a drone detection and tracking system based on deep learning algorithms.
- The combination of an object detection model and an object tracking algorithm provides an effective solution for real-time small drone detection and tracking and also handles multiple targets.
- Firstly, the quality of the synthetic dataset, which directly affects the performance of the whole system, needs to be improved.
- Based on this, the dataset will be expanded to create a multi-type drone detection and tracking system.
- Secondly, future research should address the problem of data fusion, in which information from camera-based drone detection is associated with information from other detection methods, such as radar-based, acoustic-based, or RF-based methods.
- References:
- Matson, "Real-Time and Accurate Drone Detection in a Video with a Static Background", Sensors.
- Chen, "Drone Detection and Tracking Based on Phase-Interferometric Doppler Radar", 2018 IEEE Radar Conference.
- Dongkyu 'Roy' Lee, Woong Gyu La, and Hwangnam Kim, "Drone Detection and Identification System using Artificial Intelligence", 2018 International Conference on Information and Communication Technology Convergence (ICTC).
- Mikulka, "Detection and Tracking of Moving UAVs", 2019 Photonics & Electromagnetics Research Symposium.
- Adrian Rosebrock, "Simple object tracking with OpenCV". Available at: https://www.pyimagesearch.com.
- Hashir Ali, Mahrukh Khursheed, Syeda Kulsoom Fatima, "Object Recognition for Dental Instruments Using SSD-MobileNet", 2019 International Conference on Information Science and Communication Technology (ICISCT).
- "Is Google Tensorflow Object Detection API the easiest way to implement image recognition?". Available at: https://towardsdatascience.com/is-google-tensorflow-object-detection-api-the-easiest-way-to-implementimage-recognition-a8bd1f500ea0.
- Meedeniya, "Reinstating Dlib Correlation Human Trackers Under Occlusions in Human Detection based Tracking", 2018 International Conference on Advances in ICT for Emerging Regions (ICTer).
- Lasitha Mekkayil, Hariharan Ramasangu, "Object Tracking with Correlation Filters using Selective Single Background", arXiv:1805.03453v1 [cs.CV], 9 May 2018.
- Rainer, "Evaluating multiple object tracking performance: the CLEAR MOT metrics", EURASIP J.
- The paper uses transfer learning to re-train the SSD-MobileNet-v2 deep neural network on a synthetic dataset; the achieved target recognition accuracy is 90.8%.
