Autonomous Abilities

Aims

  • Keep the robot alive at all times,
  • Let application developers focus on their specific content, without micro-programming the robot's day-to-day behavior.
To hold and release an autonomous ability, build a Holder with HolderBuilder:

Kotlin:

// Build the holder for the ability.
val holder: Holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(AutonomousAbilitiesType.BACKGROUND_MOVEMENT)
        .build()

// Hold the ability asynchronously.
holder.async().hold()

// Release the ability asynchronously.
holder.async().release()
Java:

// Build the holder for the ability.
Holder holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(AutonomousAbilitiesType.BACKGROUND_MOVEMENT)
        .build();

// Hold the ability asynchronously.
holder.async().hold();

// Release the ability asynchronously.
holder.async().release();

Operating principle

Autonomous abilities are robot behaviors (movements, animations, tracking) occurring:

  • autonomously,
  • in the background,
  • taking the robot's resources only weakly, so that explicit actions can take them over.

List of Autonomous Abilities

Autonomous Ability    Type        When
BackgroundMovement    Passive     Idle time
BasicAwareness        Reactive    Reacting to any kind of stimuli
AutonomousBlinking    Reactive    Reacting to a human
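Each ability in the table maps to an AutonomousAbilitiesType value, and several abilities can be held with a single holder. Here is a minimal Kotlin sketch, assuming a valid qiContext:

// Hold all three autonomous abilities with a single holder.
val holder: Holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(
                AutonomousAbilitiesType.BACKGROUND_MOVEMENT,
                AutonomousAbilitiesType.BASIC_AWARENESS,
                AutonomousAbilitiesType.AUTONOMOUS_BLINKING)
        .build()

// Hold the abilities asynchronously.
holder.async().hold()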

Constraining Autonomous Abilities on given degrees of freedom

Available since API level 6.

HolderBuilder.withDegreesOfFreedom allows you to define constraints restricting the Autonomous Abilities.

For instance, DegreeOfFreedom.ROBOT_FRAME_ROTATION prevents rotational movements of the robot base.

Kotlin:

// Build the holder for the degree of freedom.
val holder = HolderBuilder.with(qiContext)
        .withDegreesOfFreedom(DegreeOfFreedom.ROBOT_FRAME_ROTATION)
        .build()

// Hold the ability asynchronously.
holder.async().hold()

// Release the ability asynchronously.
holder.async().release()
// Build the holder for the degree of freedom.
Holder holder = HolderBuilder.with(qiContext)
        .withDegreesOfFreedom(DegreeOfFreedom.ROBOT_FRAME_ROTATION)
        .build();

// Hold the ability asynchronously.
holder.async().hold();

// Release the ability asynchronously.
holder.async().release();

Note

Bear in mind that these constraints apply only to Autonomous Abilities such as BasicAwareness; they are not applied during an EngageHuman action. EngageHuman does not offer such an option, but LookAt provides something similar, called LookAtMovementPolicy.

For further details, see: LookAt - Setting the movement policy.

Prioritization

The prioritization system works as follows:

By default, the robot makes background movements (also known as idle movements) and blinks. If the robot detects a stimulus, it may move its head and body, overriding the background movements. Finally, if a high-level behavior such as a Say, an Animate or an EngageHuman is executed, it takes priority over the previous modules.

Future-proof thanks to API levels

Thanks to the API level mechanism, new autonomous abilities that become available on the robot will not modify the current behavior of your application: new abilities are activated only if you rebuild your application with the new QiSDK version.

Tips & Tricks

Prohibiting any movement

If I want the robot to keep a posture (e.g. at the end of an animation), I must hold BackgroundMovement and BasicAwareness so that the robot remains perfectly still.
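A minimal Kotlin sketch of this tip, assuming a valid qiContext:

// Hold BackgroundMovement and BasicAwareness so that the robot keeps its posture.
val holder: Holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(
                AutonomousAbilitiesType.BACKGROUND_MOVEMENT,
                AutonomousAbilitiesType.BASIC_AWARENESS)
        .build()

holder.async().hold()

// Release the abilities once the posture no longer needs to be kept.
holder.async().release()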

Deactivating body language while speaking or listening

If I want the robot to hold its position while speaking or listening, I should disable the body language.

Note that, when using Chat and QiChatbot, you can disable the body language option:

  • of the Chat, to freeze while listening, and
  • of the QiChatbot, to freeze while speaking, as sketched below.
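A minimal Kotlin sketch, assuming a valid qiContext and a prebuilt topic (the topic variable is illustrative):

// Build a QiChatbot from the topic and disable its speaking body language.
val qiChatbot: QiChatbot = QiChatbotBuilder.with(qiContext)
        .withTopic(topic)
        .build()
qiChatbot.setSpeakingBodyLanguage(BodyLanguageOption.DISABLED)

// Build the Chat action and disable its listening body language.
val chat: Chat = ChatBuilder.with(qiContext)
        .withChatbot(qiChatbot)
        .build()
chat.setListeningBodyLanguage(BodyLanguageOption.DISABLED)

// Run the chat asynchronously.
chat.async().run()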

For further details, see: Chat and QiChatbot.
