Seminar by João Quintas
Human perception is heavily influenced by top-down predictions, making it more difficult to detect or recognize out-of-context objects than familiar ones, and numerous studies have shown the priming effect of one concept on another (e.g. Glass and Holyoak, 1986). Human-machine interaction should evolve beyond the limits of conscious or direct inputs and become capable of perceiving context as humans do and of learning from shared experiences.
The aim of this work is to devise methods and algorithms for sharing context information, sensory information, and cognitive capabilities in heterogeneous systems, and to explore new approaches to human-robot interaction that integrate contextual information.
The problems addressed relate to:
- In human-machine interaction there is not yet a satisfactory framework for reasoning and learning based on context information in a distributed system, which entails sharing contextual information between the different components of the system;
- Architecture, context modelling, and algorithms are customized for each problem, making it difficult to compare performance across different systems;
- Current systems use static, implicit representations of context, preventing dynamic system adaptation through information sharing and learning.
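To make the third point concrete, the contrast between a static, implicit context representation and an explicit, shareable one can be sketched as below. This is an illustrative sketch only, not the architecture proposed in the seminar; the names `ContextStore`, `ContextEntry`, and the publish/subscribe mechanism are assumptions introduced here for illustration.

```python
import time
from dataclasses import dataclass
from typing import Any, Callable, Dict, List, Optional


@dataclass
class ContextEntry:
    """One piece of context, explicit about its value, origin, and age."""
    value: Any
    source: str       # which component produced this information
    timestamp: float  # when it was produced, so consumers can judge staleness


class ContextStore:
    """A minimal explicit context model that components of a distributed
    system could share: any component may publish context updates, and any
    component may subscribe to be notified of them (hypothetical design)."""

    def __init__(self) -> None:
        self._entries: Dict[str, ContextEntry] = {}
        self._subscribers: List[Callable[[str, ContextEntry], None]] = []

    def update(self, key: str, value: Any, source: str) -> None:
        """Publish a context update and notify all subscribed components."""
        entry = ContextEntry(value=value, source=source, timestamp=time.time())
        self._entries[key] = entry
        for callback in self._subscribers:
            callback(key, entry)

    def get(self, key: str) -> Optional[Any]:
        """Read the current value of a context item, or None if unset."""
        entry = self._entries.get(key)
        return entry.value if entry is not None else None

    def subscribe(self, callback: Callable[[str, ContextEntry], None]) -> None:
        """Register a component to react whenever the context changes."""
        self._subscribers.append(callback)
```

Because the representation is explicit and observable, a perception component could publish, say, a user's location, and an interaction component could adapt to it as it changes, rather than each component keeping its own hard-coded assumptions about context.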