ENTROPY-GUIDED TRACKER SWITCHING METHOD FOR UNMANNED AERIAL VEHICLE REAL-TIME TRACKING
Abstract
Background. Auto-guidance for unmanned aerial vehicles (UAVs) requires reliable real-time target tracking on resource-constrained onboard hardware. Modern state-of-the-art CNN-based and Transformer-based deep trackers provide strong accuracy but are often too slow and computationally expensive for continuous deployment on edge devices. In contrast, lightweight correlation-filter trackers run at high frame rates but can easily drift or lose the target under occlusions or fast maneuvers. This robustness–efficiency trade-off, sometimes termed the edge-AI paradox, motivates adaptive strategies that balance accuracy, speed, and resource usage while preserving compute headroom for other onboard tasks.
Materials and methods. We propose an entropy-guided tracker switching method that combines a lightweight kernelized correlation filter (KCF) tracker augmented with Kalman motion prediction and a more accurate Siamese deep tracker. A motion-entropy scheduler quantifies the unpredictability of target motion using a normalized Shannon entropy over recent orientation changes. To avoid reacting to transient spikes, the entropy is exponentially smoothed, and threshold rules (with hysteresis) determine when KCF is sufficient and when to activate the deep tracker.
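The scheduling logic described above can be illustrated in a short sketch. This is a minimal illustration, not the authors' implementation: the window size, bin count, smoothing factor, and thresholds are hypothetical placeholders, as is the `MotionEntropyScheduler` name. It computes a normalized Shannon entropy over a histogram of recent orientation changes, smooths it exponentially, and applies two-level (hysteresis) thresholds to decide when the deep tracker should run.

```python
import math
from collections import deque

class MotionEntropyScheduler:
    """Sketch of an entropy-guided tracker switch (illustrative parameters)."""

    def __init__(self, window=30, bins=8, alpha=0.3, th_high=0.6, th_low=0.4):
        self.angles = deque(maxlen=window)  # recent orientation changes (radians)
        self.bins = bins
        self.alpha = alpha        # EMA smoothing factor
        self.th_high = th_high    # switch to the deep tracker above this
        self.th_low = th_low      # fall back to KCF below this (hysteresis gap)
        self.smoothed = 0.0
        self.use_deep = False

    def update(self, dtheta):
        """Feed one orientation change; return True if the deep tracker
        should process the current frame."""
        self.angles.append(dtheta)
        # Histogram of orientation changes over the sliding window
        counts = [0] * self.bins
        for a in self.angles:
            idx = min(int((a % (2 * math.pi)) / (2 * math.pi) * self.bins),
                      self.bins - 1)
            counts[idx] += 1
        total = len(self.angles)
        # Shannon entropy, normalized by log2(bins) so H lies in [0, 1]
        h = -sum((c / total) * math.log2(c / total) for c in counts if c > 0)
        h /= math.log2(self.bins)
        # Exponential smoothing damps transient entropy spikes
        self.smoothed = self.alpha * h + (1 - self.alpha) * self.smoothed
        # Hysteresis: separate up/down thresholds avoid rapid oscillation
        if self.smoothed > self.th_high:
            self.use_deep = True
        elif self.smoothed < self.th_low:
            self.use_deep = False
        return self.use_deep
```

Under this scheme, steady motion concentrates the orientation changes in one histogram bin, the normalized entropy stays near zero, and the cheap KCF path runs alone; erratic maneuvers spread the histogram, push the smoothed entropy past the upper threshold, and trigger the deep tracker until motion settles below the lower threshold.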
Results and discussion. Experiments on UAV benchmarks (UAV123, OTB100) show that the hybrid tracker improves success AUC by ~10% over KCF and reaches about 70% of a Transformer tracker’s AUC while running 1.5–3× faster than always-on deep tracking. The switcher invokes the deep tracker only during difficult intervals, sustaining real-time operation (~100 FPS) and reducing average computation to ≈0.6 GFLOPs per frame versus ≈1–4 GFLOPs for purely deep tracking.
Conclusion. The proposed motion-entropy scheduler enables an adaptive trade-off between efficiency, speed, and accuracy. By temporarily switching to the more robust deep tracker, it maintains high tracking precision during target maneuvers and occlusions, while saving computational load during steady-motion periods. This framework offers a practical solution for high-performance UAV tracking on the edge, leaving resource headroom for other improvement techniques.
Keywords: object tracking; correlation filters; Siamese network; motion entropy; hybrid tracker; edge computing.
DOI: http://dx.doi.org/10.30970/eli.33.10

Electronics and information technologies / Електроніка та інформаційні технології