Embodiment, Sociality, Stuff is Hard (HRI reading response)


“[O]ne might measure embodiment in terms of the complexity of the relationship between robot and environment over all possible interactions (i.e., all perturbatory channels).” - Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3–4), 143–166.

I was interested to see a concrete definition of embodiment as related to robots. I’m still thinking about what makes something a robot and not some other sort of computer. I have in my head, of course, an idea of a “typical” robot, but it’s based on fictional portrayals of robots and not on the actual machines that have historically been called “robots.” Embodiment seems to be an important part of it, but not the only part. And of course, embodiment means different things in different contexts. I’m familiar with talking about human embodiment from a psychological perspective, which is in many ways a very different idea from what people mean when they call robots “embodied.” So I appreciate seeing a definition that is clear and that I can apply directly to specific devices, like the example of “smart houses” we talked about a little in the very first class.

Maybe it’s not so much that there are robots and non-robots as that there are devices with differing degrees of “robot-ness.” And of course, as the rest of the article discusses in such detail, there isn’t just one kind of robot, and maybe not even just one standard of “robot-ness.” The social robots discussed in the article don’t have very much in common with industrial robots.

So, I guess once you’ve decided that a thing is more like a robot than not, Fong, Nourbakhsh, and Dautenhahn offer possible criteria for determining the “socialness” of the robot - though people have obviously approached that idea of socialness in very different ways. It’s hard to say, for instance, which is more social: a “social interface” robot with a simulated face intended to convey very specific emotional responses, but that is only convincing for short interactions in limited situations, or a “socially evocative” robot that has relatively simple and ambiguous responses to human actions, but that is appealing enough that people form long-term attachments to it.

Human social behavior is hard, basically. Social interaction is more or less what human brains are best specialized to carry out - I’ve heard evolutionary psychology theories suggesting that the advantages produced by language and complex social interaction were the primary impetus for human brains to grow so large so quickly. It’s not exactly a well-tested theory, but I haven’t really heard any better ones. Successful social interaction depends on many things that we are currently not very good at getting computers to do, like learning over time or interpreting ambiguous input. So, social robots have a long way to go.

(Check me out, rocking the APA-style citations even in blog posts. I was drilled relentlessly in APA style in undergrad, you bet I’m going to show it off now.)