University of Louisville

Detection and Tracking with Philip K. Dick (PKD) Android Robot

Grade Level at Time of Presentation

Junior

Major

Computer Science & Engineering

Minor

Entrepreneurship

Institution

University of Louisville

KY House District #

41

KY Senate District #

33

Department

Computer Science & Engineering

Abstract

As technology advances rapidly, the demand for seamless interaction between humans and robots grows with it. At the Louisville Automation & Robotics Research Institute (LARRI) at the University of Louisville, we work across all tiers of robotics, from service to social robotics. Philip K. Dick (PKD) is an android robot with an expressive face whose actuators can be manipulated to produce a wide range of facial expressions. In the work presented here, we aimed to enable the android robot to track objects through coordinated neck and eye movement. This allows the robot to maintain eye contact during conversation and to follow objects of interest.

We implemented an algorithm on PKD to track objects marked with AR tags, using an Xbox Kinect and Python scripts. The Kinect camera was external to the PKD setup and relayed the necessary vision data for processing and robot control. We accomplished object detection, translation of pixel coordinates to real-world coordinates, and corresponding head-tracking movements on PKD. Constant bidirectional communication between the Kinect's reading of the AR tag and PKD's head servo motors was achieved using a Publisher-Subscriber methodology in the Robot Operating System (ROS) framework. As the controller receives the object's coordinate data, it computes the gaze direction of the PKD robot; the motor controllers are then commanded to move the robot's head in that direction.
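The abstract does not include implementation details, but the pixel-to-real-world translation and gaze-direction computation it describes can be sketched roughly as follows. The camera intrinsics (FX, FY, CX, CY) are illustrative placeholder values, and the function names are our own, not those of the actual PKD system:

```python
import math

# Assumed Kinect RGB intrinsics (illustrative values, not a real calibration).
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 319.5, 239.5   # principal point (roughly the image center)

def pixel_to_world(u, v, depth_m):
    """Back-project a pixel (u, v) with its Kinect depth reading into
    camera-frame coordinates in meters, using the pinhole camera model."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m

def gaze_angles(x, y, z):
    """Pan/tilt angles (radians) that would point the head at the 3-D target.
    Pan rotates about the vertical axis; tilt about the horizontal axis
    (image y grows downward, hence the sign flip)."""
    pan = math.atan2(x, z)
    tilt = math.atan2(-y, math.hypot(x, z))
    return pan, tilt

# Example: an AR tag detected at the image center, 1 m away,
# requires no head motion (pan and tilt are both zero).
x, y, z = pixel_to_world(319.5, 239.5, 1.0)
pan, tilt = gaze_angles(x, y, z)
```

In a ROS Publisher-Subscriber setup like the one described, a node would subscribe to the tag-pose topic, run a computation along these lines, and publish the resulting pan/tilt commands to the head servo controllers.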

In conclusion, adding this seamless tracking capability to PKD's basic functionalities is a first step toward further research into the Uncanny Valley phenomenon, testing the robot's resemblance to a human being and its acceptability. The work presented here is an effort to enable the android head to mimic human-like motion.

 
