On the one hand, models of sensorimotor interaction are embodied in the environment and in the interaction with other agents. On the other hand, recent Deep Learning developments in Natural Language Processing (NLP) allow models to capture increasing language complexity (e.g., compositional representations, word embeddings, long-term dependencies). However, these NLP models are disembodied in the sense that they are learned from static datasets of text or speech. How can we bridge the gap from low-level sensorimotor interaction to high-level compositional symbolic communication? The SMILES workshop will address this issue through an interdisciplinary approach involving researchers from (but not limited to):
Organizers:
This Workshop is scheduled for Nov. 2nd and 3rd; check the schedule.
The ability to perceive, understand and respond to social interaction in a human-like manner is one of the most desired skills for artificial agents. This set of skills is highly complex and depends on several different research fields, including affective understanding. An agent that can recognize, understand and, most importantly, adapt to different affective reactions from humans can increase its social capabilities by being able to interact and communicate naturally.
This Workshop has already been held.
Organizers:
Important dates:
This Workshop is scheduled for Oct. 30th; check the schedule.