SoftBank Robotics documentation What's new in NAOqi 2.5?

ALTactileGesture

NAOqi Sensors & LEDs - Overview | API


What it does

ALTactileGesture is intended to manage tactile gestures on the head sensors.

With this module, you can:

  • Detect tactile gestures performed on the head sensors,
  • Respond to tactile gestures via qi.Signals and ALMemory events,
  • Create new tactile gestures on-the-fly.

How it works

Note

ALTactileGesture utilizes many features of the new qimessaging framework (see qimessaging-python).

Tactile Gesture

On the head of each robot are 3 touch sensors, labeled Front, Middle, and Rear. For further details see: NAO Tactile Head or Pepper Head tactile sensor.

When you interact with these sensors, you are making a tactile gesture. Tactile gestures can range from simple to complex. An example of a simple tactile gesture would be a single touch of the front head sensor (known as a “SingleFront”), while a more complex gesture example would be one known as “ZoomIn” – whereby you first touch the front and rear head sensors at the same time, then slide both fingers toward the middle sensor and lift up.

A tactile gesture is composed of two elements:

  • a name

    The name of a tactile gesture is either built into the system, or auto-generated based on the sequence. For further details see: Default Tactile Gestures.

  • a sequence

    The sequence of a tactile gesture is a binary representation of activity on the head sensors from moment to moment. This representation takes the form of a list of binary triplets. Each digit of the triplet corresponds to the activity on the Front, Middle and Rear sensors – in that order. For example, the tactile gestures known as “SingleFront” and “ZoomIn” (described previously) are [000, 100, 000] and [000, 101, 010, 000], respectively. Here, 0 means no activity on the sensor, while 1 means activity.
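    As a sketch, the triplet encoding described above can be written in a few lines of Python. `encode_triplet` is a hypothetical helper for illustration only, not part of the ALTactileGesture API:

    ```python
    # Hypothetical helper (not part of the ALTactileGesture API): encodes the
    # instantaneous state of the three head sensors into the triplet notation
    # used by gesture sequences. Digit order is Front, Middle, Rear.
    def encode_triplet(front, middle, rear):
        """Return a triplet string such as '100' from three booleans."""
        return ''.join('1' if active else '0' for active in (front, middle, rear))

    # Building the "SingleFront" sequence moment by moment:
    single_front = [
        encode_triplet(False, False, False),  # no contact yet
        encode_triplet(True, False, False),   # front sensor touched
        encode_triplet(False, False, False),  # released
    ]
    # single_front == ['000', '100', '000']
    ```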

    Note

    All sequences must start with “000”, as “000” means that there is no activity on any of the head sensors – which is always the case when you begin a gesture.

    Sequences do not have to end with “000”, however. If a sequence ends in a triplet other than “000”, it is a “hold” gesture (i.e. you are holding down a particular triplet). For example, as stated before, the “SingleFront” gesture is represented as [000, 100, 000]. Contrast this with the “SingleFrontHold” gesture, which is represented as [000, 100].

    Note

    Once a particular triplet has been held, the gesture is complete (and matched, if possible). Any input following that is recognized as part of a new gesture. As a consequence, something like “SingleFront SingleFrontHold SingleFront” is not a valid gesture.
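    The hold rule above boils down to a single check on the final triplet. This helper is illustrative only, not an ALTactileGesture API call:

    ```python
    # Hypothetical check (not an ALTactileGesture API call): a sequence denotes
    # a hold gesture exactly when its final triplet is not '000', i.e. at least
    # one sensor is still pressed at the end of the sequence.
    def is_hold_gesture(sequence):
        return sequence[-1] != '000'

    print(is_hold_gesture(['000', '100', '000']))  # SingleFront -> False
    print(is_hold_gesture(['000', '100']))         # SingleFrontHold -> True
    ```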

Evaluation parameters

The evaluation of an input tactile gesture depends on 3 key timing intervals. These are:

  • settle time

    Settle time is the length of time between the initial sensor change (i.e. the moment you first make contact with any of the 3 head sensors) and the moment the full sensor read (of all 3 head sensors) occurs.

  • hold time

    Hold time is the length of time between the full sensor read and when the currently active pattern is labeled as a hold. Furthermore, this is the length of time between each re-firing of the signal/event for a still-held gesture.

  • sequence time

    Sequence time is the maximum length of time allowed between input patterns. After a full sensor read, if no tactile event is registered on any of the 3 head sensors within the sequence time, the accumulated sequence is evaluated as a gesture. If a tactile event is registered within that interval, it is treated as part of the ongoing tactile gesture.
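To make the role of sequence time concrete, here is an illustrative pure-Python sketch (not the actual NAOqi implementation) that groups timestamped sensor reads into gesture sequences, closing a sequence whenever the gap between reads exceeds the sequence time. The 0.5 s value is an assumption for the example:

```python
def split_into_gestures(timed_triplets, sequence_time=0.5):
    """Group (timestamp, triplet) full sensor reads into gesture sequences.

    A gap longer than sequence_time between consecutive reads closes the
    current sequence and starts a new one. Illustrative sketch only.
    """
    gestures = []
    current = []
    last_t = None
    for t, triplet in timed_triplets:
        if current and t - last_t > sequence_time:
            gestures.append(current)
            current = []
        current.append(triplet)
        last_t = t
    if current:
        gestures.append(current)
    return gestures

reads = [(0.0, '000'), (0.1, '100'), (0.2, '000'),   # SingleFront
         (1.5, '000'), (1.6, '001'), (1.7, '000')]   # SingleRear, after a pause
# split_into_gestures(reads) -> [['000', '100', '000'], ['000', '001', '000']]
```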

Default Tactile Gestures

By default, there are a number of tactile gestures built into the system. They are:

  • ‘SingleFront’: [000, 100, 000]
  • ‘SingleMiddle’: [000, 010, 000]
  • ‘SingleRear’: [000, 001, 000]
  • ‘DoubleFront’: [000, 100, 000, 100, 000]
  • ‘DoubleMiddle’: [000, 010, 000, 010, 000]
  • ‘DoubleRear’: [000, 001, 000, 001, 000]
  • ‘SingleTap’: [000, 111, 000]
  • ‘DoubleTap’: [000, 111, 000, 111, 000]
  • ‘CaressFtoR’: [000, 100, 010, 001, 000]
  • ‘CaressRtoF’: [000, 001, 010, 100, 000]
  • ‘ZoomIn’: [000, 101, 010, 000]
  • ‘ZoomOut’: [000, 010, 101, 000]
  • ‘TheClaw’: [000, 101, 000]
  • ‘SingleFrontHold’: [000, 100]
  • ‘SingleMiddleHold’: [000, 010]
  • ‘SingleRearHold’: [000, 001]
  • ‘SingleTapHold’: [000, 111]
  • ‘TheClawHold’: [000, 101]
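The table above can be mirrored as a Python dict, which makes it easy to match an input sequence back to its gesture name. The `match_gesture` helper is illustrative and not part of the module's API; on the robot, ALTactileGesture's own getGesture() performs this lookup:

```python
# The default name <-> sequence mapping, transcribed from the table above.
DEFAULT_GESTURES = {
    'SingleFront': ['000', '100', '000'],
    'SingleMiddle': ['000', '010', '000'],
    'SingleRear': ['000', '001', '000'],
    'DoubleFront': ['000', '100', '000', '100', '000'],
    'DoubleMiddle': ['000', '010', '000', '010', '000'],
    'DoubleRear': ['000', '001', '000', '001', '000'],
    'SingleTap': ['000', '111', '000'],
    'DoubleTap': ['000', '111', '000', '111', '000'],
    'CaressFtoR': ['000', '100', '010', '001', '000'],
    'CaressRtoF': ['000', '001', '010', '100', '000'],
    'ZoomIn': ['000', '101', '010', '000'],
    'ZoomOut': ['000', '010', '101', '000'],
    'TheClaw': ['000', '101', '000'],
    'SingleFrontHold': ['000', '100'],
    'SingleMiddleHold': ['000', '010'],
    'SingleRearHold': ['000', '001'],
    'SingleTapHold': ['000', '111'],
    'TheClawHold': ['000', '101'],
}

def match_gesture(sequence):
    """Return the gesture name for a sequence, or None if unrecognized.

    Illustrative helper; on the robot, use ALTactileGesture.getGesture().
    """
    for name, seq in DEFAULT_GESTURES.items():
        if seq == sequence:
            return name
    return None
```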

Signals

The recommended way to respond to tactile gestures is using the onGesture signal. Whenever a tactile gesture is recognized, the onGesture signal is fired with the name of the tactile gesture that was matched as its value. With this pattern, it is convenient to attach a single “gestureHandler” function to the signal and handle the processing of the matched gestures there.

For compatibility, a corresponding ALMemory event is also raised, ALTactileGesture/Gesture.

When carrying out a hold gesture, if the gesture is continuously held, the associated onGesture signal will fire repeatedly (once every hold period) until release. Often it is desirable to react only to the initial onGesture signal associated with a hold gesture and ignore subsequent firings. This can be accomplished easily via the onRelease signal: when a tactile gesture has completed (i.e. the user has released from carrying out the tactile gesture), the onRelease signal fires.

Again, for compatibility, a corresponding ALMemory event is also raised, ALTactileGesture/Release.

Getting started

The following Python script shows example usages of ALTactileGesture.

To run:

python altactilegesture_example.py --ip <Robot IP> --port <Port>

altactilegesture_example.py

import qi
import argparse
import sys

"""Example: ALTactileGesture Example Application"""


class ReactToTactileGesture():
    def __init__(self, app):
        """
        ALTactileGesture example application.
        """
        app.start()
        session = app.session

        # Connect services
        self.tts = session.service('ALTextToSpeech')
        self.tg = session.service('ALTactileGesture')

        # Connect tactile gesture handler to onGesture signal
        self.s1 = self.tg.onGesture.connect(self.tactile_gesture_handler)

        # Connect tactile gesture release handler to onRelease signal
        self.s2 = self.tg.onRelease.connect(self.tactile_gesture_release_handler)

        # Using getGesture(), get the name of the gesture we're labeling "DoubleTap"
        doubleTap = ['000', '111', '000', '111', '000']
        self.DoubleTap = self.tg.getGesture(doubleTap)

        # Create new gestures
        self.add_gestures()

        # Boolean 'lock' useful for responding to 'hold' gestures in a more controlled manner
        self.gesture_hold_lock = False

        print('ALTactileGesture Example Application:')
        print('Please touch the robot head sensors, for example the front one.')

    def add_gestures(self):
        """
        Add a couple new gestures into ALTactileGesture.
        """

        # Create a new gesture "TripleTap"
        try:
            self.TripleTap = None
            tripleTap = ['000', '111', '000', '111', '000', '111', '000']
            self.TripleTap = self.tg.createGesture(tripleTap)
        except RuntimeError as e:
            print(e)

        # Using getSequence(), create "QuadrupleTap" by building off of "TripleTap"
        try:
            tt_seq = self.tg.getSequence(self.TripleTap)
            if tt_seq:
                tt_seq.extend(['111', '000'])
                self.QuadrupleTap = self.tg.createGesture(tt_seq)
        except RuntimeError as e:
            print(e)

    def tactile_gesture_handler(self, value):
        """
        Given a tactile gesture, say the gesture if we're listening for it.
        """
        # A default gesture...
        if value == 'SingleFront':
            self.tts.say(value)

        # Another default gesture, via getGesture() ...
        if value == self.DoubleTap:
            self.tts.say('DoubleTap')

        # New gesture
        if value == self.TripleTap:
            self.tts.say('TripleTap')

        # New gesture, using getSequence() to build from ...
        if value == self.QuadrupleTap:
            self.tts.say('QuadrupleTap')

        # Hold gesture, repeats every 'hold period'
        if value == 'SingleFrontHold':
            self.tts.say(value)

        # Hold gesture, only responds to first firing
        if not self.gesture_hold_lock and value == 'SingleRearHold':
            self.gesture_hold_lock = True
            self.tts.say(value)

    def tactile_gesture_release_handler(self):
        """
        Enables 'locking out' of multiple 'hold gesture' signal responses
        """
        self.gesture_hold_lock = False

    def clean_up(self):
        """
        Disconnect tactile gesture handler from signal
        """
        self.tg.onGesture.disconnect(self.s1)
        self.tg.onRelease.disconnect(self.s2)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--ip", type=str, default="127.0.0.1",
                        help="Robot IP address. On robot or Local Naoqi: use '127.0.0.1'.")
    parser.add_argument("--port", type=int, default=9559,
                        help="Naoqi port number")

    args = parser.parse_args()
    try:
        # Initialize qi framework.
        connection_url = "tcp://" + args.ip + ":" + str(args.port)
        app = qi.Application(["ReactToTactileGesture", "--qi-url=" + connection_url])
    except RuntimeError:
        print("Can't connect to Naoqi at ip \"" + args.ip + "\" on port " + str(args.port) + ".\n"
              "Please check your script arguments. Run with -h option for help.")
        sys.exit(1)
    react_to_tactile_gesture = ReactToTactileGesture(app)
    app.run() # will exit when the connection is over.
    react_to_tactile_gesture.clean_up()