Datahose - Read Events

Creates real-time message/event streams and reads events from them.
The datahose API provides messages and events from all conversations in the pod, including those the service user is not part of. The types of events surfaced can be found in the Real Time Events list.
Not yet available in production. Requires Agent 22.6+.
Add-on: The Datahose API is an add-on to the Symphony Services, and is subject to additional charges. Prior to using Datahose in your Symphony environment(s), you will need to enter into a specific contract. Please reach out to [email protected] to discuss the offering, its pricing or for any further information.
POST https://youragentURL.symphony.com/agent/v5/events/read
Read Real Time Events from an event stream (aka datafeed)

_Available on Agent 22.5.1 and above._

This endpoint provides either messages and events from all conversations that the user is in, or events from the whole pod, depending on the value of the "type" field:
- When "type": "fanout" is provided in the body, only events from streams the account belongs to are returned.
- When "type": "datahose" is provided in the body, the events returned are not limited to the streams the user belongs to. In this case, at least one event type must be provided in the "filters" field. If you are using a datahose feed and retrieving SOCIALMESSAGE events, the ceservice account must be properly configured in the Agent.

The types of events returned can be found in the Real Time Events list (see definition at the top of the file).

The ackId sent as a parameter can be empty for the first call. An ackId is then sent back in the response and can be used for the next call: in this way you acknowledge that you have received the events that came with that ackId.

If you have several instances of the same bot, they must share the same feed so that events are spread across all bot instances. To do so, you must:
- share the same service account
- provide the same "tag" and the same "filters" values

Parameters
Header
sessionToken*
string
Session authentication token.
keyManagerToken*
string
Key Manager authentication token.
Body
Schema
{
  "type": "string",
  "tag": "string",
  "eventTypes": [
    "string"
  ],
  "ackId": "string",
  "updatePresence": true
}
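As a sketch of how a read request could be assembled (not official SDK code; the tokens, tag, and URL below are placeholders to be replaced with values from your own Agent deployment and authentication flow):

```python
# Sketch of a datahose read request. The tokens, tag, and URL are
# placeholders; real values come from your authentication flow.

def build_read_request(session_token, key_manager_token, ack_id=""):
    """Build headers and body for POST /agent/v5/events/read."""
    headers = {
        "sessionToken": session_token,
        "keyManagerToken": key_manager_token,
        "Content-Type": "application/json",
    }
    body = {
        "type": "datahose",               # or "fanout" for account-scoped events
        "tag": "my-datahose-feed",        # same tag across all bot instances
        "eventTypes": ["SOCIALMESSAGE"],  # "datahose" requires at least one event type
        "ackId": ack_id,                  # empty on the very first call
        "updatePresence": True,
    }
    return headers, body

headers, body = build_read_request("my-session-token", "my-km-token")
# e.g. requests.post("https://youragentURL.symphony.com/agent/v5/events/read",
#                    headers=headers, json=body)
```

Reusing the same "tag" (and the same service account) across bot instances is what lets the Agent fan events out over a single shared feed.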
Responses
200: OK
Datafeed successfully read.
400: Bad Request
Bad request.
401: Unauthorized
Unauthorized.
403: Forbidden
Forbidden.
500: Internal Server Error
Internal server error.

Datahose is not yet available in production.
Entitlements: The service account user needs to have both the Can read from datahose feeds and Can create datahose feeds entitlements enabled to call this endpoint.

Configuration

The credentials of the Content Export service need to be set up in the Agent configuration for datahose to work.
The description of the configuration fields for the Content Export service is available in the Agent configuration guide (look for agent.privatekey.ceservice and agent.certificate.ceservice).
To check that the Content Export service is correctly set up, you can call the Agent extended health check endpoint (/agent/v3/health/extended).
The value of users.ceservice.status should be "UP", as in the example below.
{
  "services": {
    "datafeed": {
      "status": "UP",
      "version": "2.11.2"
    },
    "key_manager": {
      "status": "UP",
      "version": "20.16.3"
    },
    "pod": {
      "status": "UP",
      "version": "20.16.74-216-100b3be"
    }
  },
  "status": "UP",
  "users": {
    "agentservice": {
      "authType": "RSA",
      "status": "UP"
    },
    "ceservice": {
      "authType": "RSA",
      "status": "UP"
    }
  },
  "version": "23.11.1-716"
}
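A minimal sketch of how that check could be automated, assuming the health payload has the shape shown in the example above:

```python
import json

# Verify that Content Export is up, given the JSON string returned by
# the extended health check endpoint (/agent/v3/health/extended).

def ceservice_is_up(health_json: str) -> bool:
    health = json.loads(health_json)
    return health.get("users", {}).get("ceservice", {}).get("status") == "UP"

sample = '{"users": {"ceservice": {"authType": "RSA", "status": "UP"}}}'
print(ceservice_is_up(sample))  # True
```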

Using ackId

The Datahose - Read Events endpoint returns the Real Time Events happening in the pod, either since the datahose feed was created or since the feed was last read by the bot. The ackId plays an essential role in retrieving the right events for the bot.
The ackId should be null or empty for the first call. For subsequent requests, the ackId from the previous response payload should be reused to acknowledge the events already retrieved by the bot. Please note that you can very easily access this API via our BDKs in Java and Python.
If a batch of events is not acknowledged by sending its ackId, those events will be returned again in subsequent reads and may be mixed in with newer events.
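The acknowledgement cycle can be sketched as follows. Here fetch_events is a placeholder for the actual HTTP call to /v5/events/read (it is not an SDK function): it takes the previous ackId and returns a response dict containing the next "ackId" and a batch of "events".

```python
# Sketch of the ackId acknowledgement loop. fetch_events stands in for
# the HTTP call to /v5/events/read.

def read_loop(fetch_events, handle_event, iterations):
    ack_id = ""  # empty for the very first call
    for _ in range(iterations):
        response = fetch_events(ack_id)
        for event in response.get("events", []):
            handle_event(event)
        # Reuse the returned ackId on the next call to acknowledge this
        # batch; unacknowledged batches are redelivered on later reads.
        ack_id = response.get("ackId", "")
    return ack_id
```

For example, with a fake fetch function returning two batches, the loop passes an empty ackId on the first call and the first batch's ackId on the second.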

Fair use policy

The Datahose API is subject to a fair use policy of 5 active feeds.
If your integration or workflow requires more than 5 feeds active at the same time, please contact Symphony.