The Actuation service contains the actions and objects needed to make the robot move (see the sketch after this list):
- the robot and gaze frames
- the factory for Animation objects
- the factories for GoTo, Animate and LookAt actions
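For orientation, a minimal sketch of accessing these objects, assuming a valid `qiContext` (a QiContext obtained in `onRobotFocusGained()`); imports from `com.aldebaran.qi.sdk` packages are omitted in this and the following sketches:

```java
// Assumes `qiContext` is a valid QiContext held by the current activity.
Actuation actuation = qiContext.getActuation();
Frame robotFrame = actuation.robotFrame(); // frame attached to the robot base
Frame gazeFrame = actuation.gazeFrame();   // frame attached to the robot gaze
```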
Structure representing an age.
Action to play animations on the robot.
Build a new Animate
Object representing a robot animation. An animation can be composed of gestures performed by the robot limbs, head, and/or trajectories performed by the robot base.
Build a new Animation
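A minimal sketch of the Animation/Animate pair, assuming a valid `qiContext` and an animation file shipped as a raw resource (`R.raw.dance_b001` is a placeholder name):

```java
// Build an Animation object from a raw animation resource.
Animation animation = AnimationBuilder.with(qiContext)
        .withResources(R.raw.dance_b001) // placeholder resource name
        .build();
// Build the Animate action and run it on the robot.
Animate animate = AnimateBuilder.with(qiContext)
        .withAnimation(animation)
        .build();
animate.async().run();
```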
Interface for objects that can provide an AnyObject.
Parent class for QiService objects that run on the tablet.
Converter for AnyObjectWrapper objects.
Action to make the robot go towards a human and respond to various situations on the way.
Build a new ApproachHuman
Object representing a frame attached to a parent frame. The link between the parent and the attached frame, i.e. the relative location of the attached frame to its parent, is editable. In order to compute transforms between frames, one should use the frame() function of an AttachedFrame.
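A minimal sketch, assuming a valid `qiContext`: attach a frame one meter ahead of the robot, then use `frame()` when computing transforms:

```java
Frame robotFrame = qiContext.getActuation().robotFrame();
// Attach a frame 1 m in front of the robot base; the link stays editable.
Transform oneMeterAhead = TransformBuilder.create().fromXTranslation(1.0);
AttachedFrame attachedFrame = robotFrame.makeAttachedFrame(oneMeterAhead);
// frame() exposes the underlying Frame for transform computations.
Frame target = attachedFrame.frame();
```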
Enum containing the possible attention states of the human when interacting with the robot. States are defined by where the human is looking, using the human as the frame of reference.
A service that allows clients to select the desired autonomous abilities to pause or resume during an activity. The service guarantees that the ability owner is the only one in control of the ability as long as the corresponding holder is not released. Holding an ability automatically pauses it; to resume it, the owner must release the holder.
An AutonomousAbilityHolder represents an autonomous ability being taken from the AutonomousAbilities service. It serves only once, and emits released() whenever the autonomous ability is released. An AutonomousAbilityHolder that was released is invalid.
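A minimal sketch of holding and releasing an autonomous ability through the Holder API, assuming a valid `qiContext`:

```java
// Holding an ability pauses it; releasing the holder resumes it.
Holder holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(AutonomousAbilitiesType.BACKGROUND_MOVEMENT)
        .build();
holder.async().hold();     // pause background movements
// ... later, when exclusive control is no longer needed:
holder.async().release();  // resume them
```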
A reaction suggested by a Chatbot.
Additional information on the importance of a suggested ChatbotReaction.
Describes the validity of a suggested ChatbotReaction.
Parent class for Chatbot implementations.
Parent class for ChatbotReaction implementations.
Parent class for QiChatExecutor implementations.
Body language policy.
Object representing a marked location in a topic.
Object representing the state of a Bookmark during Discuss execution.
Service exposing actions and properties related to the robot camera.
Action that listens to the users and interrogates its Chatbots to select the most appropriate answers.
Object representing a Chatbot that can react in response to Phrases.
Action produced by a Chatbot, either as a reply to a Phrase or spontaneously.
Describes the current status of a ChatbotReaction (in a Chat action, for example).
Build a new Chat
Optional parameters for the configuration of a Chat action
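A minimal sketch of a full chat pipeline, assuming a valid `qiContext` and a QiChat topic shipped as a raw resource (`R.raw.greetings` is a placeholder name):

```java
// Load a topic, wrap it in a QiChatbot, and run a Chat action on it.
Topic topic = TopicBuilder.with(qiContext)
        .withResource(R.raw.greetings) // placeholder resource name
        .build();
QiChatbot qiChatbot = QiChatbotBuilder.with(qiContext)
        .withTopic(topic)
        .build();
Chat chat = ChatBuilder.with(qiContext)
        .withChatbot(qiChatbot)
        .build();
chat.async().run(); // the robot now listens and replies using the topic
```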
Service exposing actions and properties related to human-robot conversation.
An object collecting Conversation-related signals and properties for a given application context
Struct representing a date as a string.
Struct representing a dateTime as a string.
A degree of freedom.
Action to make the robot able to converse with a human using content from QiChat topics.
Build a new Discuss
Object for editing a named graph.
A container of Phrases that can be edited.
Object containing the emotional state properties. It is a three-dimensional representation of the emotional state, based on the PAD model of Albert Mehrabian. See: Mehrabian, Albert (1980). Basic dimensions for a general psychological theory.
Encoded Image
Encapsulates an EncodedImage. This object enables sharing EncodedImage data while delaying the copy until the most appropriate time.
Action to limit robot movements in order to ease user interaction on the tablet. The robot will put the tablet at a suitable position then emit the positionReached() signal, and prevent further leg and base movements, while also ensuring that the arms movements do not bring them in front of the tablet.
Build a new EnforceTabletReachability
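A minimal sketch, assuming a valid `qiContext`:

```java
EnforceTabletReachability enforceTabletReachability =
        EnforceTabletReachabilityBuilder.with(qiContext).build();
enforceTabletReachability.addOnPositionReachedListener(() -> {
    // The tablet is in position; movements stay limited until the run is canceled.
});
Future<Void> enforcing = enforceTabletReachability.async().run();
```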
Action to make the robot look at a human and keep eye contact.
Build a new EngageHuman
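A minimal sketch, assuming a valid `qiContext`: engage the human that HumanAwareness currently recommends (null when nobody is around):

```java
HumanAwareness humanAwareness = qiContext.getHumanAwareness();
Human human = humanAwareness.getRecommendedHumanToEngage();
if (human != null) {
    EngageHuman engageHuman = EngageHumanBuilder.with(qiContext)
            .withHuman(human)
            .build();
    engageHuman.async().run(); // keeps eye contact until the engagement ends or is canceled
}
```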
Enum containing the engagement intention of the human toward the robot, as perceived.
Eye contact policy.
Converts QiEnum values from and to raw AnyObject.
Enum containing the perceived energy in the emotion.
Object encapsulating the data needed by the robot to localize himself inside his environment.
Build a new ExplorationMap
Structure containing expression data computed from a human's face.
Object representing a flap, that may be open or closed.
Description of a flap sensor state.
A service tracking the current focus, and guaranteeing that only one client has the focus at the same time. The focus is required for actions to be performed on the robot. This mechanism ensures the focus owner that it can be the only one to control the robot as long as its FocusOwner is not released.
A FocusOwner represents a focus being taken from the focus service. It serves only once, and emits released() whenever the focus is released. A FocusOwner that was released is invalid.
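In practice the focus is rarely taken by hand; it is typically obtained through `QiSDK.register()`, which delivers a focused QiContext via RobotLifecycleCallbacks. A minimal sketch:

```java
public class MainActivity extends AppCompatActivity implements RobotLifecycleCallbacks {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        QiSDK.register(this, this); // request the robot focus for this activity
    }

    @Override
    public void onRobotFocusGained(QiContext qiContext) {
        // The focus is held: actions built from this qiContext can run.
    }

    @Override
    public void onRobotFocusLost() {
        // The FocusOwner was released: stop using robot resources.
    }

    @Override
    public void onRobotFocusRefused(String reason) { }

    @Override
    protected void onDestroy() {
        QiSDK.unregister(this, this);
        super.onDestroy();
    }
}
```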
Object representing the location associated with an object or a person. This location is likely to change over time, for example when a person moves. Transforms can be computed out of two frames, at a given time, to get the relative position and orientation between two objects. If the robot is localized using external sensors, the transform between two frames can be computed with odometry drift compensation.
Object representing a reference frame free to be placed anywhere, that does not move when other frames move. The global position of a free frame can be updated by giving its location at a given time in a given reference frame. In order to compute transforms between frames, one should use the frame() function of a FreeFrame. The FreeFrame will be invalid right after creation until first update.
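A minimal sketch contrasting the two, assuming a valid `qiContext`: pin a FreeFrame at the robot's current location, then compute the transform back to it later:

```java
Mapping mapping = qiContext.getMapping();
Frame robotFrame = qiContext.getActuation().robotFrame();
// A FreeFrame is invalid until its first update.
FreeFrame freeFrame = mapping.makeFreeFrame();
Transform identity = TransformBuilder.create().fromXTranslation(0.0);
freeFrame.update(robotFrame, identity, 0L); // 0L: timestamp, here the latest data
// Later: where is that pinned location relative to the robot now?
TransformTime transformTime = freeFrame.frame().computeTransform(robotFrame);
Transform delta = transformTime.getTransform();
```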
A Consumer used to automatically log a Future's error or cancellation.
Utility methods for working with Futures.
Enum containing different genders of a human.
Action to make the robot go somewhere. The destination is represented by a target frame. The robot will try to safely reach the 2D location corresponding to the target frame while avoiding obstacles. The robot may look around and follow non-straight paths in order to choose the safest way towards the target.
Build a new GoTo
Configuration parameters of a GoTo action. If a parameter is not set by the user, it is up to the action to set it to its default behavior.
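A minimal sketch, assuming a valid `qiContext` and a target 1 m ahead of the robot:

```java
Frame robotFrame = qiContext.getActuation().robotFrame();
Transform transform = TransformBuilder.create().fromXTranslation(1.0);
FreeFrame targetFrame = qiContext.getMapping().makeFreeFrame();
targetFrame.update(robotFrame, transform, 0L);
GoTo goTo = GoToBuilder.with(qiContext)
        .withFrame(targetFrame.frame())
        .build();
Future<Void> moving = goTo.async().run(); // cancel this future to stop the motion
```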
Build a new Holder
Object representing a physical person detected by the robot.
Service exposing actions and properties related to human-robot interaction.
Utility methods for working with raw files and assets.
Service handling the shared knowledge.
Object for reading data from given named graphs.
All the possible languages
Language utility class.
Action to make the robot listen to and recognize a specific set of phrases pronounced by a user. On recognition success, a ListenResult gives the heard phrase and the matching PhraseSet.
Build a new Listen
Optional parameters for the configuration of a Listen action
The heard phrase along with the matching phrase set.
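A minimal sketch, assuming a valid `qiContext` (Listen, like all actions, must run off the UI thread or via `async()`):

```java
PhraseSet yesSet = PhraseSetBuilder.with(qiContext).withTexts("yes", "yep", "indeed").build();
PhraseSet noSet = PhraseSetBuilder.with(qiContext).withTexts("no", "nope").build();
Listen listen = ListenBuilder.with(qiContext)
        .withPhraseSets(yesSet, noSet)
        .build();
ListenResult result = listen.run(); // blocking: call from a background thread
String heard = result.getHeardPhrase().getText();
PhraseSet matched = result.getMatchedPhraseSet();
```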
A locale
Localization process status
Action to make the robot localize himself in a map, previously built by a LocalizeAndMap action. Only one LocalizeAndMap or Localize action can run at a time. When run, the robot first executes a rotating base and head scan. During this initial scan, the action needs all the robot resources to run, and the action status is `Scanning`. After this scan, the robot is localized inside his map, and his position and orientation relatively to the map origin can be retrieved, at any time, by calling `Actuation.robotFrame().computeTransform(Mapping.mapFrame())`. While the action is running, the robot may autonomously look around to confirm his location.
Action to make the robot explore an unknown environment, while localizing himself inside it and building a representation of this environment, known as an exploration map. Only one LocalizeAndMap or Localize action can run at a time. When run, the robot first executes a rotating base and head scan. During this initial scan, the action needs all the robot resources to run and the action status is `Scanning`. After the scan, it is the developer's responsibility to make the robot move, and to stop the mapping when done. The developer thus has full control over the mapped area. While the action is running, the robot may autonomously look around to confirm his location. A given environment needs to be mapped once and only once. The result of this mapping can be dumped to an ExplorationMap object. Afterwards, the ExplorationMap object can be used to create a Localize action, that will enable the robot to keep track of his position relative to the map.
Build a new LocalizeAndMap
Build a new Localize
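A minimal sketch of the map-then-localize workflow, assuming a valid `qiContext` (error handling omitted):

```java
// 1. Map the environment (the robot starts with a rotating base-and-head scan).
LocalizeAndMap localizeAndMap = LocalizeAndMapBuilder.with(qiContext).build();
Future<Void> mapping = localizeAndMap.async().run();
// ... move the robot around the area to map, then dump the map and stop:
ExplorationMap explorationMap = localizeAndMap.dumpMap();
mapping.requestCancellation();
// 2. Reuse the map to keep track of the robot position.
Localize localize = LocalizeBuilder.with(qiContext)
        .withMap(explorationMap)
        .build();
localize.async().run();
```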
Struct representing a string with a language.
Action to look at and track a target. The target is represented by a frame, and the robot will look at the origin of that frame. In practice, the action will make the robot move so that the x-axis of the gaze frame gets aligned with the origin of the target frame. The robot will track the target until the run is canceled.
Build a new LookAt
Strategies to look at a target.
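A minimal sketch, assuming a valid `qiContext` and a `targetFrame` already at hand (e.g. a human head frame):

```java
LookAt lookAt = LookAtBuilder.with(qiContext)
        .withFrame(targetFrame)
        .build();
// Restrict the strategy to head movements only, instead of whole-body motion.
lookAt.setPolicy(LookAtMovementPolicy.HEAD_ONLY);
Future<Void> looking = lookAt.async().run(); // tracks until this future is canceled
```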
A service providing the mapping of the local area.
Used to display the map shape on a UI. Scale, x, y and theta can be used to switch from world coordinates (relative to mapFrame) to image coordinates in pixels, using the following formulas:
- x_map = scale * (cos(theta) * x_img + sin(theta) * y_img) + x
- y_map = scale * (sin(theta) * x_img - cos(theta) * y_img) + y
- x_img = 1/scale * (cos(theta) * (x_map - x) + sin(theta) * (y_map - y))
- y_img = 1/scale * (sin(theta) * (x_map - x) - cos(theta) * (y_map - y))
Map frame coordinates (x_map, y_map) are in meters, with y_map pointing up; image pixel coordinates (x_img, y_img) have x_img pointing right and y_img pointing down.
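As an illustration, a small helper implementing these formulas; the class and method names are hypothetical, only the math comes from the definitions above:

```java
/** Converts between map coordinates (meters) and image coordinates (pixels). */
final class MapImageConverter {
    final double scale, x, y, theta; // values from the map's graphical representation

    MapImageConverter(double scale, double x, double y, double theta) {
        this.scale = scale; this.x = x; this.y = y; this.theta = theta;
    }

    double[] toMap(double xImg, double yImg) {
        double xMap = scale * (Math.cos(theta) * xImg + Math.sin(theta) * yImg) + x;
        double yMap = scale * (Math.sin(theta) * xImg - Math.cos(theta) * yImg) + y;
        return new double[] { xMap, yMap };
    }

    double[] toImage(double xMap, double yMap) {
        double xImg = (Math.cos(theta) * (xMap - x) + Math.sin(theta) * (yMap - y)) / scale;
        double yImg = (Math.sin(theta) * (xMap - x) - Math.cos(theta) * (yMap - y)) / scale;
        return new double[] { xImg, yImg };
    }
}
```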
Struct representing a node. A Node holds an Object which can be a ResourceNode or a LiteralNode. Every node other than a ResourceNode is considered a literal. Literal nodes can handle the following types:
- str,
- LocalizedString,
- float,
- int,
- TimeString,
- DateTimeString,
- DateString.
Policy defining orientation control of a given frame with respect to a target frame.
Path planning strategies to go to a target.
Structure containing a chunk of text intended to be said or listened for by the robot.
Object representing a set of phrases. Used to group phrases considered as synonyms. Example: "yes", "yep", "indeed", ...
Build a new PhraseSet
PhraseSet utility class.
Enum containing the possible states reached on the pleasure-displeasure scale.
The Power service aggregates information linked to robot power management.
Chatbot that can be used to make the robot able to chat with a human. The interaction will be based on the content given in the QiChat topics.
Build a new QiChatbot
Object representing a user-defined action to execute synchronously during an utterance in a QiChatbot.
Object representing a variable in a QiChat topic.
Session disconnection listener.
Represents a connection to a robot.
Helper to initialize Qi SDK.
Shared thread pool.
Quaternion representation of rotations. See https://en.wikipedia.org/wiki/Quaternion for more information.
All the possible regions
Region utility class.
Additional information on the Priority of a Chatbot Reply
A Chatbot reaction to a human utterance
A requirement creates and holds a Future, which represents the value once satisfied.
Struct representing a resource node which has a unique URL among the triple database.
The context object gathers together all the handles and tokens that authorize an action to be effectively executed on a robot.
The context factory is a service providing a context.
Robot Lifecycle Callback.
Alternative AutonomousReaction that lets subclasses implement its execution instead of delegating it to a ChatbotReaction.
Alternative ReplyReaction that lets subclasses implement its execution instead of delegating it to a ChatbotReaction.
Action to make the robot say a phrase.
Build a new Say
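A minimal sketch, assuming a valid `qiContext`:

```java
Say say = SayBuilder.with(qiContext)
        .withText("Hello, I am ready.")
        .build();
say.async().run(); // use run() instead when already on a background thread
```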
Enum containing the possible smiling states of the human. This feature is only based on facial expression.
Factory to create Say actions
Default implementation of AutonomousReaction.
Default implementation of ReplyReaction.
A buffer that can be read little by little.
Factory of StreamableBuffer.
Action to take pictures on the robot.
Build a new TakePicture
Timestamped encoded image
Associates a timestamp with an EncodedImageHandle.
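A minimal sketch of taking a picture and unwrapping the result, assuming a valid `qiContext` (on a background thread):

```java
TakePicture takePicture = TakePictureBuilder.with(qiContext).build();
TimestampedImageHandle timestampedImageHandle = takePicture.run();
// Unwrap: handle -> encoded image -> raw bytes, copied only at this point.
EncodedImageHandle encodedImageHandle = timestampedImageHandle.getImage();
EncodedImage encodedImage = encodedImageHandle.getValue();
ByteBuffer buffer = encodedImage.getData();
```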
Struct representing a time as a string.
Object representing a topic.
Build a new Topic
The current state of a topic in a Discuss Action
The Touch service provides objects to access and subscribe to sensor data.
Object representing a sensor that detects when the robot is touched.
Description of a touch sensor state.
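A minimal sketch, assuming a valid `qiContext` ("Head/Touch" is one of the sensor names exposed by the service):

```java
Touch touch = qiContext.getTouch();
TouchSensor headSensor = touch.getSensor("Head/Touch");
headSensor.addOnStateChangedListener(touchState -> {
    if (touchState.getTouched()) {
        // React to the head being touched.
    }
});
```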
A homogeneous transformation matrix. See http://en.wikipedia.org/wiki/Transformation_matrix for more information. It is represented by a 3D vector and a quaternion.
Build a new Transform
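A minimal sketch of TransformBuilder, the usual way to create transforms:

```java
// Pure translation: 1 m along the x-axis.
Transform ahead = TransformBuilder.create().fromXTranslation(1.0);
// Planar pose: x = 1 m, y = 0.5 m, rotation of 90 degrees around the z-axis.
Transform pose = TransformBuilder.create().from2DTransform(1.0, 0.5, Math.PI / 2);
```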
A transform associated with a timestamp.
Struct representing a triple. Subject and predicate are always resources, the object can be a resource or a literal (here encapsulated in a Node).
A generic 3D vector.