Home  /  Applied Sciences  /  Vol. 13, Issue 14 (2023)  /  Article

Real-Time Information Fusion System Implementation Based on ARM-Based FPGA

Yu-Hsiang Tsai, Yung-Jhe Yan, Meng-Hsin Hsiao, Tzu-Yi Yu and Mang Ou-Yang

Abstract

In this study, an information fusion system displayed fused information on a transparent display by considering the relationships among the display, the background exhibit, and the user's gaze direction. We used an ARM-based field-programmable gate array (FPGA) to perform the virtual-real fusion of this system and evaluated its execution speed. The ARM-based FPGA used Intel RealSense D435i depth cameras to capture depth and color images of an observer and an exhibit. The image data were received on the ARM side and fed to the FPGA side for real-time object detection. The FPGA accelerated the computation of the convolutional neural networks used to recognize observers and exhibits. In addition, a module implemented on the FPGA was developed for rapid registration between the color and depth images. The module calculated the size and position of the information shown on the transparent display from the pixel coordinates and depth values of the human eye and the exhibit. A personal computer with an RTX 2060 GPU performed the information fusion in ~47 ms, whereas the ARM-based FPGA accomplished it in 25 ms. Thus, the fusion speed of the ARM-based FPGA was 1.8 times that of the computer.
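The abstract describes computing the on-display position and size of the overlay from the eye's and exhibit's 3D coordinates. A minimal sketch of this geometry, assuming a pinhole-style line-of-sight model with the transparent display modeled as the plane z = 0 between observer and exhibit (the coordinate frame, plane placement, and function name are illustrative assumptions, not the authors' implementation):

```python
def fuse_on_display(eye, exhibit, display_z=0.0):
    """Place an overlay on a transparent display between eye and exhibit.

    eye, exhibit: (x, y, z) points in a shared camera-derived frame, with
    the display plane at z = display_z, the observer at z > display_z,
    and the exhibit at z < display_z (assumed layout).
    Returns ((x, y) intersection on the display plane, overlay scale factor).
    """
    ex, ey, ez = eye
    ox, oy, oz = exhibit
    if ez == oz:
        raise ValueError("eye and exhibit at equal depth; no plane crossing")
    # Parametric point along the gaze ray: P(t) = eye + t * (exhibit - eye).
    # Solve for t where P(t).z == display_z.
    t = (display_z - ez) / (oz - ez)
    if not 0.0 <= t <= 1.0:
        raise ValueError("display plane is not between eye and exhibit")
    px = ex + t * (ox - ex)
    py = ey + t * (oy - ey)
    # By similar triangles, a feature of size S at the exhibit projects to
    # size S * t on the display plane (t = eye-to-display fraction of the
    # eye-to-exhibit depth span), so t doubles as the overlay scale.
    return (px, py), t
```

For example, with the eye at (0, 0, 1) m and the exhibit at (0.5, 0, -1) m, the overlay lands at (0.25, 0) on the display at half the exhibit's apparent size; in the paper's pipeline the eye and exhibit coordinates would come from the depth-camera detections rather than being given directly.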