Workshop on Novel Substrates and Models for the Emergence of Developmental, Learning and Cognitive Capabilities.
In order for sensing, reasoning, development and learning to take place in artificial systems, computing substrates and models that support the emergence of such properties are required.
This workshop encompasses the understanding, analysis, modeling and development of novel substrates that show development and learning abilities, such as (but not limited to):
- Cellular automata and random boolean networks,
- Reservoir computing,
- Novel nanoscale materials (e.g. carbon nanotubes),
- Biological materials and systems (e.g. neuronal cultures),
- Evolvable hardware and neuromorphic computing systems,
- Novel neural models and models of biological neurons,
- Evolution-in-materio systems,
- Computational matter,
- Slime mould computing,
- Micro- and nano-scale electronic chemistry,
- Substrates that exhibit self-reconfiguration, fault tolerance, self-repair, and adaptation capabilities,
- Artificial generative and developmental systems,
- Computational intelligence techniques for novel developmental and learning substrates,
- Learning through self-organization and emergence.
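As a minimal illustration of the first substrate on the list, an elementary cellular automaton can be sketched in a few lines of Python. The rule number, grid width, and step count below are arbitrary choices for illustration; rule 110 is a standard example of a simple update rule that supports rich emergent dynamics:

```python
# Elementary cellular automaton: each cell is 0 or 1, and the next state of a
# cell depends only on its own state and its two neighbours. The 8 possible
# neighbourhoods index into the bits of an 8-bit rule number.

def step(cells, rule):
    """Apply one synchronous update with periodic (wrap-around) boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right  # value 0..7
        out.append((rule >> neighbourhood) & 1)              # look up rule bit
    return out

def run(width=31, steps=15, rule=110):
    """Evolve from a single active cell; return the full space-time history."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Printing the history as rows of `#`/`.` characters makes the emergent structure visible; substrates such as random boolean networks generalize the same idea to irregular connectivity and per-node update rules.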
This full-day workshop will investigate the sensorimotor and affective mechanisms that underlie human-robot interaction. Humans constantly display non-verbal and affective cues and expressions to foster cooperation and mutual understanding and to signal trustworthiness. If these cues are not appropriately reciprocated, however, the interaction can be negatively impacted. Moreover, inappropriate reciprocation, or the lack thereof, may be the result of misperception and/or untimely reactions. Failure to adequately account for biologically plausible perceptual and temporal facets of interaction may detract from the quality of human-robot interaction and hinder progress in the field of social robotics more generally.
Incorporating naturalistic and adaptive forms of sensorimotor and affective non-verbal communication into human-robot interaction is challenging because such interaction depends heavily on the context and on the relationship between the observer and the expresser. Interaction between biological agents often involves explicit forms of social signalling, such as nodding, non-verbal gestures, and emotional expressions, the interpretation of which may be highly context-sensitive. Furthermore, naturalistic social signalling may involve a degree of mimicry of autonomic responses such as pupil dilation, blinking, and blushing, which in human-robot interaction requires the implementation of time-sensitive perceptual mechanisms currently underused in both commercial and research robotics platforms.
In this workshop, we will investigate and discuss to what extent the aforementioned naturalistic social signalling capabilities need to be accounted for in human-robot interaction, which modalities are most relevant, and in what contexts. The workshop will focus strongly on research motivated by naturalistic empirical data. We hope to provide a discussion-friendly environment that connects research covering complementary interests in robotics, computer science, psychology, neuroscience, affective computing, and animal learning research.
The primary list of topics covers the following (but is not limited to):
– Emotion recognition
– Gesture recognition
– Social gaze recognition
– The development of expression and recognition capabilities
– Joint visual attention and activity
– Alignment in social interactions
– Non-verbal cues in human-robot interaction
Our bodies constitute the interface between our mind and the surrounding world. To interact effectively with the world, babies and robots need to discover the different parts of their bodies, and their physical and mechanical properties. They need to know where their limbs are in space, how far they can extend, how they work, and what range of actions they can achieve with them. Such body representations are essential for developing a sense of the self and for defining the limits of the peripersonal space. The aim of this workshop is to bring together researchers in the fields of developmental psychology, developmental robotics, embodied cognition, body mapping, and computational modeling to create a productive, collaborative, and multidisciplinary forum in which the different facets of these fundamental processes, linked to body mapping and mapping the self, will be explored. The workshop will be organized around three general themes, with specific speakers and experts in each area:
- Body mapping: This theme will group developmental and learning studies in infants, adults, and robots on self-touch, self-exploration of the body, self-localization, and haptic perception to understand how the mapping of the self forms.
- Defining the peripersonal space: This theme will examine the range and conditions through which our peripersonal space develops, is perceived, and is modulated through interactions with objects, people, and events in our daily surroundings. It will contrast infant, child, and adult studies as well as computational models.
- Sense of self and body ownership: This third theme will explore how our brain builds a map and a sense of ownership of objects as extensions of our own body. Studies and models of tool-use learning and prosthesis mastery, including studies of the rubber hand illusion, will provide a framework for understanding how inanimate objects can be perceived and acted upon as functional extensions of our body and self.
Here, we offer an overview of developmental psychology theories and research methods through presentations, interactive break-out discussions, and poster sessions. Our aim is to find a common language and groundwork to continue and foster collaborations in the interdisciplinary field of developmental robotics. We will provide a space in which no question is too basic, in order to advance mutual understanding. The workshop is open for poster submissions by email to Thusha Rajendran: firstname.lastname@example.org. See the workshop webpage for more information.
ICDL-EpiRob 2019 Workshop on Personal Robotics and Secure Human-Robot Collaboration (Joint APRIL and SECURE ITN Symposium)
This workshop will focus on learning- and interaction-based approaches to safe human-robot collaboration and their application to personal robotics. The workshop will include research topics such as developmental approaches to robot learning, safe interaction in uncertain environments, affect and emotion modelling for safe human-robot interaction, and language and non-verbal communication.
Children acquire language in interactions with caregivers and peers in a socio-cultural environment. By the time children start to talk, their visual perception, body movement, navigation, and object manipulation have already reached some level of competence. Together with developing auditory control and initial schemas for social interactions (games), communication arises gradually, embedded in the social-interactionist environment. Even though there are various efforts in developmental robotics to model communication, the emergence of symbolic communication is still an unsolved problem. We lack convincing theories and implementations that show how cooperation and interaction knowledge could emerge in long-term experiments with populations of robotic agents.
The workshop will address the following issues:
• role of context in language acquisition (cultural, social, interactional)
• role of everyday interactions in language acquisition
• empirical paradigms for studying language acquisition and emergence
• longitudinal studies of language acquisition
• language grounding / embodiment / enaction
• modeling acquisition of syntax and/or semantics
• models of sentence and interaction processing
• modeling with artificial neural networks
• human robot interaction (especially linked with grounding, …)
• models of language emergence using Reinforcement Learning
• robot language acquisition
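As a toy sketch of the language-emergence-via-reinforcement-learning topic above, a Lewis signalling game with urn-style reinforcement can be written in plain Python. The agent structure, reward scheme, and all parameters below are illustrative assumptions, not a description of any workshop contribution:

```python
import random

# Lewis signalling game: a speaker observes one of N states and emits one of
# N signals; a listener maps the signal back to a guessed state. Whenever the
# guess matches the state, both agents reinforce the (input, output) pair,
# so a shared state-signal convention can emerge without being pre-specified.

N = 3  # number of states and signals (illustrative choice)

def choose(weights):
    """Sample an action index in proportion to its accumulated weight."""
    total = sum(weights)
    r = random.uniform(0, total)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

def train(rounds=5000, seed=0):
    """Play repeated rounds, reinforcing successful state-signal-guess triples."""
    random.seed(seed)
    speaker = [[1.0] * N for _ in range(N)]   # state  -> signal weights
    listener = [[1.0] * N for _ in range(N)]  # signal -> state weights
    for _ in range(rounds):
        state = random.randrange(N)
        signal = choose(speaker[state])
        guess = choose(listener[signal])
        if guess == state:  # shared reward reinforces both mappings
            speaker[state][signal] += 1.0
            listener[signal][guess] += 1.0
    return speaker, listener

def accuracy(speaker, listener, trials=1000):
    """Fraction of rounds in which the listener recovers the speaker's state."""
    hits = 0
    for _ in range(trials):
        state = random.randrange(N)
        hits += choose(listener[choose(speaker[state])]) == state
    return hits / trials
```

After training, accuracy typically rises well above the 1/N chance level, although this simple reinforcement scheme can also settle into partial-pooling conventions; richer models discussed at the workshop ground such signals in embodied interaction rather than abstract states.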