Abstract
Human–robot collaboration (HRC) is a new and challenging discipline that plays a key role in Industry 4.0. The digital transformation of industrial plants aims to introduce flexible production lines able to adapt quickly to different products. In this scenario, HRC can boost flexible manufacturing by introducing new interaction paradigms between humans and machines. Augmented reality (AR) can convey important information to users, for instance about the status and intentions of the robot or machine the user is collaborating with. On the other hand, traditional input interfaces based on physical devices, gestures, and voice may be precluded in industrial environments. Brain–computer interfaces (BCIs) can be profitably combined with AR devices to provide technicians with solutions for collaborating effectively with robots. This paper introduces a novel BCI–AR user interface based on the NextMind and the Microsoft Hololens 2. Compared to traditional BCI interfaces, the NextMind provides an intuitive selection mechanism based on visual-cortex signals. This interaction paradigm is exploited to guide a collaborative robotic arm in a pick-and-place selection task. Since the ergonomic design of the NextMind allows its use in combination with the Hololens 2, users can visualize through AR the different parts composing the artifact to be assembled, the visual elements used by the NextMind to enable selections, and the robot's status. In this way, users' hands remain free, and their focus can stay on the objects to be assembled. Finally, user tests were performed to evaluate the proposed system, assessing both its usability and the task's workload; preliminary results are very encouraging, and the proposed solution can be considered a starting point for designing and developing affordable hybrid augmented interfaces that foster real-time human–robot collaboration.