Human




What is it

A Human represents a physical person detected by the robot. Pepper is able to focus on humans and can retrieve some of their characteristics.

How to use it

Getting humans around Pepper

Get access to the humans found by Pepper:

HumanAwareness humanAwareness = qiContext.getHumanAwareness();
List<Human> humansAround = humanAwareness.getHumansAround();
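Synchronous QiSDK calls should not be made on the UI thread, so the humans can also be retrieved asynchronously. A minimal sketch (TAG is an assumed log tag):

```java
HumanAwareness humanAwareness = qiContext.getHumanAwareness();
humanAwareness.async()
              .getHumansAround()
              .andThenConsume(humans -> Log.i(TAG, humans.size() + " human(s) around."));
```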

Getting the human position

The human position is available via the getHeadFrame method:

Human human = ...;
Frame headFrame = human.getHeadFrame();

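The head frame can be combined with the robot frame to estimate how far away the human is. A minimal sketch: the QiSDK calls shown in the comment (computeTransform, getTranslation) follow the Frame API, and the distance helper below is a hypothetical utility applying the Euclidean norm:

```java
// Assuming the robot frame is obtained from the Actuation service:
//   Frame robotFrame = qiContext.getActuation().robotFrame();
//   Vector3 t = headFrame.computeTransform(robotFrame).getTransform().getTranslation();
//   double meters = distance(t.getX(), t.getY(), t.getZ());

// Hypothetical helper: Euclidean norm of the translation components.
static double distance(double x, double y, double z) {
    return Math.sqrt(x * x + y * y + z * z);
}
```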
Getting the human face picture

The human face picture with accompanying timestamp is available via the getFacePicture method:

Human human = ...;
TimestampedImage timestampedImage = human.getFacePicture();
EncodedImage encodedImage = timestampedImage.getImage();
ByteBuffer imageData = encodedImage.getData();
Long time = timestampedImage.getTime();
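The bytes can then be decoded for display. A minimal sketch, assuming an Android environment (imageView is an assumed ImageView):

```java
// The buffer is empty when no picture satisfied the filtering conditions.
if (imageData.remaining() > 0) {
    byte[] bytes = new byte[imageData.remaining()];
    imageData.get(bytes);
    Bitmap facePicture = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    imageView.setImageBitmap(facePicture);
}
```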

The face picture and timestamp correspond to the last available image satisfying the filtering conditions:

  • Keep only images of front facing faces.
  • Discard blurry images.

Note that the buffer can be empty if no human face picture satisfies the filtering conditions; otherwise, it contains the last taken picture.

Picture capture, before filtering, occurs at a frequency of 5Hz. The provided image is an 8-bit grayscale image with a minimum size of 25x25 pixels.

Retrieving characteristics

The Human object provides some human characteristics:

Human human = ...;
Integer age = human.getEstimatedAge().getYears();
Gender gender = human.getEstimatedGender();
PleasureState pleasureState = human.getEmotion().getPleasure();
ExcitementState excitementState = human.getEmotion().getExcitement();
SmileState smileState = human.getFacialExpressions().getSmile();
AttentionState attentionState = human.getAttention();
EngagementIntentionState engagementIntentionState = human.getEngagementIntention();
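The PleasureState and ExcitementState characteristics can be combined to estimate the basic emotion of the human. A hedged sketch of one possible mapping (the emotion labels here are illustrative assumptions, not an official API):

```java
Emotion emotion = human.getEmotion();
PleasureState pleasure = emotion.getPleasure();
ExcitementState excitement = emotion.getExcitement();

String basicEmotion;
if (pleasure == PleasureState.POSITIVE) {
    // Calm and positive reads as "content", excited and positive as "joyful".
    basicEmotion = (excitement == ExcitementState.EXCITED) ? "joyful" : "content";
} else if (pleasure == PleasureState.NEGATIVE) {
    basicEmotion = (excitement == ExcitementState.EXCITED) ? "angry" : "sad";
} else {
    basicEmotion = "neutral or unknown";
}
```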

The available human characteristics are:

  • age: an Integer, based on facial features. Stabilization time is required: it takes a minimum of 1 second to stabilize, depending on the quality of the measurements, so expect unknown values before that.
  • gender: a Gender enum, based on facial features.
  • smile state: a SmileState enum, based on facial features.
  • mood: a PleasureState enum, based on facial features, voice intonation, touch and speech semantics. The PleasureState and ExcitementState characteristics can be combined to determine the basic emotion of the human.
  • excitement state: an ExcitementState enum, based on voice and movements.
  • attention state: an AttentionState enum, based on head orientation and gaze direction. These values represent the attention of the human during the previous second. They are given relative to the human's own perspective: when the human is looking to their right, the value should be LOOKING_RIGHT.
  • engagement intention state: an EngagementIntentionState enum, based on trajectory, speed and head orientation. This state describes the willingness of the human to interact with the robot. Depending on it, various robot behaviors can be created to attract interested humans, or to directly start interacting with those who are proactively seeking an interaction.
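For example, an application could react differently depending on the reported intention. A minimal sketch, assuming the INTERESTED and SEEKING_ENGAGEMENT values of the EngagementIntentionState enum:

```java
EngagementIntentionState intention = human.getEngagementIntention();
switch (intention) {
    case SEEKING_ENGAGEMENT:
        // The human is proactively looking for an interaction: engage directly.
        break;
    case INTERESTED:
        // The human shows interest: try to attract them, e.g. with an animation.
        break;
    default:
        // Not interested or unknown: do nothing.
        break;
}
```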

Engaging a human

Make Pepper engage a Human so that it will focus on him/her:

Human human = ...;
EngageHuman engageHuman = EngageHumanBuilder.with(qiContext)
                                            .withHuman(human)
                                            .build();

engageHuman.async().run();
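An engagement can also be monitored and stopped. A minimal sketch, assuming the EngageHuman listeners exposed by the QiSDK (addOnHumanIsDisengagingListener):

```java
// Keep the Future returned by the asynchronous run to be able to cancel it.
Future<Void> engagement = engageHuman.async().run();

engageHuman.addOnHumanIsDisengagingListener(() -> {
    // The engaged human is leaving: stop the engagement.
    engagement.requestCancellation();
});
```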

Performance & Limitations

Characteristics refresh rate

The age, gender, smile and engagement intention characteristics have a refresh rate of 5Hz, while the other characteristics have a refresh rate of 1Hz.

Characteristic             Refresh rate
Age                        5Hz
Gender                     5Hz
SmileState                 5Hz
EngagementIntentionState   5Hz
PleasureState              1Hz
ExcitementState            1Hz
AttentionState             1Hz

See also