Social Robot Lab

@ Chapman University

About Us


Research

Social Robot Lab studies communication between humans and robots in various social contexts. We aim to conduct innovative, high-quality research that generates knowledge and develops theories in human-robot communication, and to disseminate our findings to the academic community and the public.

Service

Social Robot Lab contributes to society through service-learning and community engagement. We aim to use social robots to help people lead happier, healthier, and more fulfilling lives. We also contribute to the Chapman University community by participating in various campus events.

Development

Social Robot Lab provides professional development opportunities for all members. Lab members learn theories and technologies to build robots that are human-like, sociable, and intuitive to interact with. Members are also encouraged to pursue their own research and creative projects.

Meet the Team


...

Austin Lee, Ph.D.

Associate Professor
Chapman University

...

Jake Liang, Ph.D.

Assistant Professor
Chapman University

...

Noel McGuire

Undergraduate Researcher
Chapman University

...

Nicole Yoo

Undergraduate Researcher
Chapman University

...

Audrey Shin

Graduate Researcher
Chapman University

...

Brian Katz

Graduate Researcher
Chapman University

...

Niloo Fathollahi

Undergraduate Researcher
Chapman University

...

May Shabani

Undergraduate Researcher
Chapman University

...

Hilary Lee

Undergraduate Researcher
Chapman University

...

Alex Lewandowski

Undergraduate Researcher
Chapman University

...

Thing 1

Nao Evolution V6
by SoftBank Robotics

...

Thing 2

Nao Evolution V6
by SoftBank Robotics

...

Casey

Telepresence Robot
by Double Robotics

...

Join the Team

Contact us at
info@socialrobotlab.com

News


Robots Can Persuade People

A robot can effectively get people to comply with its requests by capitalizing on the norm of reciprocity and then making a verbal request. We found that 60% of participants who were helped by a robot in a trivia game for just five minutes returned the favor by completing boring tasks for 15 minutes, whether or not they liked the robot. This compliance rate was significantly higher than that of participants who were not helped by the robot (33%).

Why Do People Comply?

Why do people treat robots and computers as if they were humans? Probably because people operate on a kind of autopilot when they interact with technology. In our study, a computer agent asked participants to complete boring tasks. Mindless participants complied whenever any kind of reason was given along with the magic word “because.” Attentive participants complied only when a proper reason was given.

Fear of Robots

Using a nationally representative sample, we found that 26% of the US population reports a heightened level of fear toward autonomous robots and artificial intelligence. This fear was associated with sex, age, education, and household income, as well as media exposure to science fiction. It was also related to other types of fear, such as fear of loneliness, unemployment, and drone use.

Enhancing Human-Robot Trust

We examined the effects of user-generated content on human-robot trust as well as interaction outcomes. We found that reading online reviews about robot partners before interaction augmented the effects of human-robot trust on interaction outcomes. Specifically, it elicited positive mood and more favorable evaluations of the robot. This finding suggests that messages can be strategically deployed to enhance trust and interaction outcomes.

Press Coverage


From factories, offices and medical centers to our homes, cars and even our local coffee shops, robots and artificial intelligence are playing an increasingly greater role in how we work, live and play...

Cincinnati Public Radio
October 3, 2017

As social robots permeate wide segments of our daily lives, they may become capable of influencing our attitudes, beliefs, and behaviors. In other words, robots may be able to persuade us, and this is already happening...

It’s no secret that the College of Informatics has the most technologically advanced building on campus — Griffin Hall. But Griffin Hall is also home to Pineapple. Dr. Lee and Dr. Liang are using Pineapple to conduct studies on how humans interact socially when a robot is present...

NKU Inside
October 2015

Between the studies on digital media production and computer sciences, Griffin Hall is home to some impressive technology. You might start seeing a new, digital face wandering the halls of the College of Informatics. That digital face is known as Pineapple, and it will be used for the next three years for research into social robotics...

The Northerner
October 19, 2015

Awards


Awards

Top Paper Award, National Communication Association, Communication and the Future Division (Lee & Liang, 2017)

Valerie Scudder Award: Jake Liang, Chapman University (2017)

Excellence in Research, Scholarship, and Creative Activity: Austin Lee, Northern Kentucky University (2017)

Faculty Research Excellence Award: Jake Liang, Chapman University (2016)

Top Paper Awards, International Communication Association, Instructional and Developmental Communication Division (Liang, 2015)

Grants

Pedagogical Innovations Grant: Austin Lee, Chapman University (2019)

Faculty Summer Fellowship: Austin Lee, Northern Kentucky University (2015)

Faculty Project Grant: Austin Lee, Northern Kentucky University (2015)

Research Grant: Jake Liang, Chapman University (2015)

Seed Grant: Austin Lee, Northern Kentucky University (2015)

Community Outreach


Research


Publications

Lee, S. A., & Liang, Y. (2019). A communication model of human-robot trust development for inclusive education. In J. Knox, Y. Wang, & M. S. Gallagher (Eds.), Speculative futures for artificial intelligence and educational inclusion (pp. 101-115). New York, NY: Springer Nature.

Lee, S. A., & Liang, Y. (2019). Robotic foot-in-the-door: Using sequential-request persuasive strategies in human-robot interaction. Computers in Human Behavior, 90, 351-356.

Lee, S. A., & Liang, Y. (2018). Theorizing verbally persuasive robots. In A. L. Guzman (Ed.), Human-machine communication: Rethinking communication, technology, and ourselves (pp. 119-143). New York, NY: Peter Lang.

Liang, Y., & Lee, S. A. (2017). Fear of autonomous robots: Evidence from national representative data with probability sampling. International Journal of Social Robotics, 9, 379-384.

Liang, Y., & Lee, S. A. (2016). Advancing the strategic messages affecting robot trust effect: The dynamic of user- and robot-generated content on human-robot trust and interaction outcomes. Cyberpsychology, Behavior, and Social Networking, 19, 538-544.

Lee, S. A., & Liang, Y. (2016). The role of reciprocity in verbally persuasive robots. Cyberpsychology, Behavior, and Social Networking, 19, 524-527.

Lee, S. A., & Liang, Y. (2015). Reciprocity in computer-human interaction: Source-based, norm-based, and affect-based explanations. Cyberpsychology, Behavior, and Social Networking, 18, 234-240.

Liang, Y., Lee, S. A., & Jang, J. (2013). Mindlessness and gaining compliance in computer-human interaction. Computers in Human Behavior, 29, 1572-1579.

Conference Presentations

Lee, S. A. (2019). Human-robot proxemics and compliance gaining. Paper presented at the 69th annual convention of the International Communication Association, Washington, DC.

Liang, Y., Lee, S. A., & Kee, K. F. (2019). The adoption of collaborative robots toward ubiquitous diffusion: A research agenda. Paper presented at the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2019) Workshop, Daegu, South Korea.

Lee, S. A., Appelman, A., & Waldridge, Z. J. (2018). Robot message credibility. Paper presented at the 68th annual convention of the International Communication Association, Prague, Czech Republic.

Lee, S. A., & Liang, Y. (2017). Theorizing verbally persuasive robots. Paper presented at the 103rd annual convention of the National Communication Association, Dallas, TX.

Lee, S. A., Liang, Y., & Thompson, A. M. (2017). Robotic foot-in-the-door: Using sequential-request persuasive strategies in human-robot interaction. Paper presented at the 67th annual convention of the International Communication Association, San Diego, CA.

Liang, Y., & Lee, S. A. (2016). Employing user-generated content to enhance human-robot interaction in a human-robot trust game. In Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2016). Christchurch, New Zealand.

Lee, S. A., Liang, Y., & Cho, S. (2016). Effects of anthropomorphism and reciprocity in persuasive computer agents. Paper presented at the 102nd annual convention of the National Communication Association, Philadelphia, PA.

Cho, S., Lee, S. A., & Liang, Y. (2016). Using anthropomorphic agents for persuasion. Paper presented at the 66th annual convention of the International Communication Association, Fukuoka, Japan.

Contact Us


Contact info

Becket Building 111
Chapman University
Orange, CA 92866

info@socialrobotlab.com

@socialrobotlab

@socialrobotlab