Embody your character, avatar or robot.

Usage:

curl -k https://webapp2.embodydigital.com/behavior/behavior.php --data '{ "utterance" : "hi there"}'

The response includes the behaviors that should be activated, timed to the words that are spoken. For example, the utterance (‘hi there’) contains two words: the first spans positions 0 to 1 and the second spans positions 1 to 2. The from/to fields indicate the word span of each behavior. The API returns both semantic and syntactic indicators (such as ‘deixis’, a pointing behavior, or ‘dir’, which specifies a direction) as well as specific behavior suggestions, such as ‘gesture_left’, ‘gesture_greeting’, ‘wiggle’ (a head movement), or ‘gaze_aversion_upright’ (avert the gaze to the upper-right quadrant, then return).

{"status":"ok","result":"[
{\"priority\": \"12\", \"to\": \"1\", \"from\": \"0\", \"behavior\": \"greeting\"},
{\"priority\": \"14\", \"to\": \"2\", \"from\": \"1\", \"behavior\": \"dir\"},
{\"priority\": \"13\", \"to\": \"2\", \"from\": \"1\", \"behavior\": \"dir\"},
{\"priority\": \"12\", \"to\": \"2\", \"from\": \"1\", \"behavior\": \"deixis\"},
{\"priority\": \"12\", \"to\": \"2\", \"from\": \"1\", \"behavior\": \"deixis\"},
{\"priority\": \"35\", \"to\": \"2\", \"from\": \"0\", \"behavior\": \"neutral\"},
{\"priority\": \"3\", \"to\": \"2\", \"from\": \"0\", \"behavior\": \"cogload\"},
{\"priority\": \"6\", \"to\": \"2\", \"from\": \"0\", \"behavior\": \"first_VP\"},
{\"priority\": \"6\", \"to\": \"2\", \"from\": \"0\", \"behavior\": \"first_S1\"},
{\"priority\": \"12\", \"to\": \"2\", \"from\": \"1\", \"behavior\": \"gesture_left\"},
{\"priority\": \"12\", \"to\": \"2\", \"from\": \"1\", \"behavior\": \"gaze_aversion_left\"},
{\"priority\": \"12\", \"to\": \"1\", \"from\": \"0\", \"behavior\": \"gesture_greeting\"},
{\"priority\": \"12\", \"to\": \"1\", \"from\": \"0\", \"behavior\": \"wiggle\"},
{\"priority\": \"3\", \"to\": \"2\", \"from\": \"0\", \"behavior\": \"gaze_aversion_upright\"},
{\"bml\": \"…..\"}]", "reason": ""}

In addition, the response will contain a Behavior Markup Language (BML) block that can be used in a BML realizer (like SmartBody) to directly create animation.
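
As a rough illustration, the sketch below (Python, using the requests library) calls the endpoint above and maps each returned behavior back to the words it covers. It assumes the response layout shown above, where the result field is itself a JSON-encoded string of behavior entries; the helper names are ours and not part of the API.

# Minimal sketch: call the Behavior API and print each behavior with the
# word span it covers. Assumes the response layout shown above.
import json
import requests

URL = "https://webapp2.embodydigital.com/behavior/behavior.php"

def get_behaviors(utterance):
    # curl -k is mirrored with verify=False; the body is the same raw JSON
    # that the curl examples send.
    resp = requests.post(URL, data=json.dumps({"utterance": utterance}), verify=False)
    payload = resp.json()
    if payload.get("status") != "ok":
        raise RuntimeError(payload.get("reason") or "Behavior API request failed")
    return json.loads(payload["result"])  # the result field is itself JSON text

utterance = "hi there"
words = utterance.split()
for entry in get_behaviors(utterance):
    if "behavior" not in entry:
        continue  # skip the trailing BML entry
    start, end = int(entry["from"]), int(entry["to"])
    span = " ".join(words[start:end])
    print(f'{entry["behavior"]:24s} priority {entry["priority"]:>3s}  "{span}"')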

Emotion tags

Emotion can be added by tagging a range of words with one of the following: neutral, happy, sad, surprised, angry, fear, agree, disagree, sarcasm. The emotion and the word range it applies to are given together (for example, happy=0:1). The neutral tag is the default if no emotion is specified.

curl -k https://webapp2.embodydigital.com/behavior/behavior.php --data '{ "utterance" : "hi there", "emotionalPhrase" : "happy=0:1"}'

curl -k https://webapp2.embodydigital.com/behavior/behavior.php --data '{ "utterance" : "I can'\''t believe that you did that!", "emotionalPhrase" : "angry=3:6"}'
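
For reference, a small Python sketch of composing the emotionalPhrase value is shown below. The helper function is ours, the emotion list is taken from the paragraph above, and the emotion=from:to word-range format is inferred from the two curl examples.

# Hypothetical helper for composing the "emotionalPhrase" value in the
# "emotion=from:to" word-range format used in the examples above.
VALID_EMOTIONS = {"neutral", "happy", "sad", "surprised", "angry",
                  "fear", "agree", "disagree", "sarcasm"}

def emotional_phrase(emotion, start, end):
    if emotion not in VALID_EMOTIONS:
        raise ValueError(f"unknown emotion: {emotion}")
    return f"{emotion}={start}:{end}"

request_body = {
    "utterance": "I can't believe that you did that!",
    "emotionalPhrase": emotional_phrase("angry", 3, 6),  # words 3 through 6
}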

Stress tags

The utterance can also be annotated with stress levels from 0 to 1 (0 is the default). This can be used to stress or emphasize a particular word and the behaviors associated with that word:

curl -k https://webapp2.embodydigital.com/behavior/behavior.php --data '{ "utterance" : "I saw it happen right over there!", "stress" : "0 0 0 0 0 0 1"}'

curl -k https://webapp2.embodydigital.com/behavior/behavior.php --data '{ "utterance" : "I saw it happen right over there!",  "stress" : "1 1 0 0 0 0 0"}'
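
A short sketch of building the stress string is shown below; the helper is ours and simply emits one value per word, with the chosen word indices set to the given level.

# Hypothetical helper: build the space-separated stress string (one value per
# word, 0 by default) and set the emphasized word indices to the given level.
def stress_string(utterance, emphasized_indices, level=1):
    values = [0] * len(utterance.split())
    for i in emphasized_indices:
        values[i] = level
    return " ".join(str(v) for v in values)

# Emphasize the last word of "I saw it happen right over there!"
print(stress_string("I saw it happen right over there!", [6]))  # 0 0 0 0 0 0 1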

Terms of Use and License

If you are interested in using this API, please contact info@embodydigital.com for the terms of use for this service. In addition, the Behavior API is available as an embeddable library. Please contact us for license details.

Code Sample

A code sample that accesses the Behavior API using Unreal and MetaHumans can be found in a sample Unreal 4.26 project here:

https://drive.google.com/file/d/1VNYeN2v0NpaT8lFXoXZ4fF62zOlzEZeO/view?usp=sharing

The Embody Digital code samples included are released under the MIT license.

Instructions:

  1. Expand the .zip file into a folder on a Windows machine that has Unreal 4.26 installed
  2. A MetaHuman is not included in the project; create and download a MetaHuman using the EPIC Metahuman Creator
  3. Put the MetaHuman folder into the TestEmbodySDK/Content/ folder
  4. Open the Unreal project
  5. Download the UnrealWebServer Plugin
  6. Drag the MetaHuman into the level
  7. Place the MetaHuman at position (0, 0, 33) with the z rotation set to -90
  8. Run the project
  9. Use AWS to access the Amazon Polly service and put the service key in TestEmbodySDK/embodysdkdata/ttsapi/ttsapiwithkey.bat
  10. Run TestEmbodySDK/embodysdkdata/ttsapi/ttsapiwithkey.bat
  11. Send a command to make the character speak (see the sketch after this list): curl 'https://localhost:8004/botbatte' --data '{"commandType" : "speak", "character" : "character", "value" : "hello, [happy] I am a metahuman"}'
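
The same speak command from step 11, sketched in Python. The endpoint and JSON fields mirror the curl line above; verify=False is our assumption for a local, likely self-signed certificate.

# A Python sketch of the speak command from step 11. The endpoint and JSON
# fields mirror the curl line above; verify=False is an assumption for a
# local self-signed certificate.
import json
import requests

command = {
    "commandType": "speak",
    "character": "character",
    "value": "hello, [happy] I am a metahuman",
}
resp = requests.post("https://localhost:8004/botbatte",
                     data=json.dumps(command), verify=False)
print(resp.status_code)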

Dependencies (a modified SmartBody library and a TTS connector to the Amazon Polly service) can also be found here; they can be built with Visual Studio 2015 or 2017:

https://drive.google.com/file/d/1TaITvlm4S6pF85k-YD8-jMdoa_GPLUbh/view?usp=sharing
