google_api_dialogflow v0.6.0 API Reference
Modules
API calls for all endpoints tagged `Projects`.
Handle Tesla connections for GoogleApi.Dialogflow.V2 (see the usage sketch at the end of this listing).
Helper functions for deserializing responses into models.
Represents a conversational agent.
The request message for EntityTypes.BatchCreateEntities.
The request message for EntityTypes.BatchDeleteEntities.
The request message for EntityTypes.BatchDeleteEntityTypes.
The request message for Intents.BatchDeleteIntents.
The request message for EntityTypes.BatchUpdateEntities.
The request message for EntityTypes.BatchUpdateEntityTypes.
The response message for EntityTypes.BatchUpdateEntityTypes.
The request message for Intents.BatchUpdateIntents.
The response message for Intents.BatchUpdateIntents.
Represents a context.
The request to detect the user's intent.
The message returned from the DetectIntent method.
Represents an entity type. Entity types serve as a tool for extracting parameter values from natural language queries.
This message is a wrapper around a collection of entity types.
An entity entry for an associated entity type.
Events allow for matching intents by event name instead of the natural language input. For instance, input `<event: { name: "welcome_event", parameters: { name: "Sam" } }>` can trigger a personalized welcome response. The parameter `name` may be used by the agent in the response: `"Hello #welcome_event.name! What can I do for you today?"`.
The request message for Agents.ExportAgent.
The response message for Agents.ExportAgent.
The request message for Agents.ImportAgent.
Instructs the speech recognizer how to process the audio content.
Represents an intent. Intents convert a number of user expressions or patterns into an action. An action is an extraction of a user command or sentence semantics.
This message is a wrapper around a collection of intents.
Represents a single followup intent in the chain.
Corresponds to the `Response` field in the Dialogflow console.
The basic card message. Useful for displaying information.
The button object that appears at the bottom of a card.
Opens the given URI.
The card response message.
Optional. Contains information about a button.
The card for presenting a carousel of options to select from.
An item in the carousel.
The image response message.
The suggestion chip message that allows the user to jump out to the app or website associated with this agent.
The card for presenting a list of options to select from.
An item in the list.
The quick replies response message.
Additional info about the select item for when it is triggered in a dialog.
The simple response message containing speech or text.
The collection of simple response candidates. This message in `QueryResult.fulfillment_messages` and `WebhookResponse.fulfillment_messages` should contain only one `SimpleResponse`.
The suggestion chip message that the user can tap to quickly post a reply to the conversation.
The collection of suggestions.
The text response message.
Represents intent parameters.
Represents an example that the agent is trained on.
Represents a part of a training phrase.
The response message for Contexts.ListContexts.
The response message for EntityTypes.ListEntityTypes.
The response message for Intents.ListIntents.
The response message for SessionEntityTypes.ListSessionEntityTypes.
Represents the contents of the original request that was passed to the `[Streaming]DetectIntent` call.
Instructs the speech synthesizer how to generate the output audio content.
Represents the query input. It can contain either: 1. An audio config which instructs the speech recognizer how to process the speech audio. 2. A conversational query in the form of text, or 3. An event that specifies which intent to trigger.
Represents the parameters of the conversational query.
Represents the result of conversational query or event processing.
The request message for Agents.RestoreAgent.
The response message for Agents.SearchAgents.
The sentiment, such as positive/negative feeling or association, for a unit of analysis, such as the query text.
Configures the types of sentiment analysis to perform.
The result of sentiment analysis as configured by `sentiment_analysis_request_config`.
Represents a session entity type. Extends or replaces a developer entity type at the user session level (we refer to the entity types defined at the agent level as "developer entity types"). Note: session entity types apply to all queries, regardless of the language.
Configuration of how speech should be synthesized.
Represents the natural language text to be processed.
The request message for Agents.TrainAgent.
Description of which voice to use for speech synthesis.
The request message for a webhook call.
The response message for a webhook call.
The response message for EntityTypes.BatchUpdateEntityTypes.
The response message for Intents.BatchUpdateIntents.
Represents a context.
Represents a notification sent to Cloud Pub/Sub subscribers for conversation lifecycle events.
Represents an entity type. Entity types serve as a tool for extracting parameter values from natural language queries.
An entity entry for an associated entity type.
Events allow for matching intents by event name instead of the natural language input. For instance, input `<event: { name: "welcome_event", parameters: { name: "Sam" } }>` can trigger a personalized welcome response. The parameter `name` may be used by the agent in the response: `"Hello #welcome_event.name! What can I do for you today?"`.
The response message for Agents.ExportAgent.
Represents a notification sent to Cloud Pub/Sub subscribers for agent assistant events in a specific conversation.
Represents an intent. Intents convert a number of user expressions or patterns into an action. An action is an extraction of a user command or sentence semantics.
Represents a single followup intent in the chain.
Corresponds to the `Response` field in the Dialogflow console.
The basic card message. Useful for displaying information.
The button object that appears at the bottom of a card.
Opens the given URI.
The card response message.
Optional. Contains information about a button.
The card for presenting a carousel of options to select from.
An item in the carousel.
The image response message.
The suggestion chip message that allows the user to jump out to the app or website associated with this agent.
The card for presenting a list of options to select from.
An item in the list.
The quick replies response message.
Additional info about the select item for when it is triggered in a dialog.
The simple response message containing speech or text.
The collection of simple response candidates. This message in `QueryResult.fulfillment_messages` and `WebhookResponse.fulfillment_messages` should contain only one `SimpleResponse`.
The suggestion chip message that the user can tap to quickly post a reply to the conversation.
The collection of suggestions.
Plays audio from a file in Telephony Gateway.
Synthesizes speech and plays back the synthesized audio to the caller in Telephony Gateway. Telephony Gateway takes the synthesizer settings from `DetectIntentResponse.output_audio_config` which can either be set at request-level or can come from the agent-level synthesizer config.
Transfers the call in Telephony Gateway.
The text response message.
Represents intent parameters.
Represents an example that the agent is trained on.
Represents a part of a training phrase.
Represents the result of querying a Knowledge base.
An answer from Knowledge Connector.
Metadata in google::longrunning::Operation for Knowledge operations.
Represents the contents of the original request that was passed to the `[Streaming]DetectIntent` call.
Represents the result of conversational query or event processing.
The sentiment, such as positive/negative feeling or association, for a unit of analysis, such as the query text.
The result of sentiment analysis as configured by `sentiment_analysis_request_config`.
The request message for a webhook call.
The response message for a webhook call.
This resource represents a long-running operation that is the result of a network API call.
A generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method. For instance: `service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); }` The JSON representation for `Empty` is an empty JSON object `{}`.
The `Status` type defines a logical error model that is suitable for different programming environments, including REST APIs and RPC APIs. It is used by gRPC. The error model is designed to be:
- Simple to use and understand for most users
- Flexible enough to meet unexpected needs

Overview: The `Status` message contains three pieces of data: error code, error message, and error details. The error code should be an enum value of google.rpc.Code, but it may accept additional error codes if needed. The error message should be a developer-facing English message that helps developers understand and resolve the error. If a localized user-facing error message is needed, put the localized message in the error details or localize it in the client. The optional error details may contain arbitrary information about the error. There is a predefined set of error detail types in the package `google.rpc` that can be used for common error conditions.

Language mapping: The `Status` message is the logical representation of the error model, but it is not necessarily the actual wire format. When the `Status` message is exposed in different client libraries and different wire protocols, it can be mapped differently. For example, it will likely be mapped to some exceptions in Java, but more likely mapped to some error codes in C.

Other uses: The error model and the `Status` message can be used in a variety of environments, either with or without APIs, to provide a consistent developer experience across different environments. Example uses of this error model include:
- Partial errors. If a service needs to return partial errors to the client, it may embed the `Status` in the normal response to indicate the partial errors.
- Workflow errors. A typical workflow has multiple steps. Each step may have a `Status` message for error reporting.
- Batch operations. If a client uses batch request and batch response, the `Status` message should be used directly inside batch response, one for each error sub-response.
- Asynchronous operations. If an API call embeds asynchronous operation results in its response, the status of those operations should be represented directly using the `Status` message.
- Logging. If some API errors are stored in logs, the message `Status` could be used directly after any stripping needed for security/privacy reasons.
An object representing a latitude/longitude pair. This is expressed as a pair of doubles representing degrees latitude and degrees longitude. Unless specified otherwise, this must conform to the [WGS84 standard](http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf). Values must be within normalized ranges.
Helper functions for building Tesla requests.
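
Usage sketch

The sketch below is a minimal, hedged illustration of how the modules above fit together: a Tesla connection is built via `GoogleApi.Dialogflow.V2.Connection`, and a text query is sent as a `DetectIntent` request through the `Projects` API. The use of `Goth` for the OAuth2 token, the project and session IDs, and the exact name and arity of the generated function are assumptions; verify them against the generated `GoogleApi.Dialogflow.V2.Api.Projects` and `GoogleApi.Dialogflow.V2.Model` modules in this release.

```elixir
# Obtain a bearer token (Goth is one common choice; any valid OAuth2
# token for the Dialogflow scope works) and build a Tesla connection.
{:ok, token} = Goth.Token.for_scope("https://www.googleapis.com/auth/cloud-platform")
conn = GoogleApi.Dialogflow.V2.Connection.new(token.token)

alias GoogleApi.Dialogflow.V2.Model.{
  GoogleCloudDialogflowV2DetectIntentRequest,
  GoogleCloudDialogflowV2QueryInput,
  GoogleCloudDialogflowV2TextInput
}

# A QueryInput carries exactly one of: an audio config, a text query,
# or an event. Here it wraps a plain text query.
request = %GoogleCloudDialogflowV2DetectIntentRequest{
  queryInput: %GoogleCloudDialogflowV2QueryInput{
    text: %GoogleCloudDialogflowV2TextInput{
      text: "I want to book a flight",
      languageCode: "en-US"
    }
  }
}

# Hypothetical project/session IDs; the function name and arity below
# follow the generator's usual pattern and should be checked against
# the Api.Projects module in this release.
{:ok, response} =
  GoogleApi.Dialogflow.V2.Api.Projects.dialogflow_projects_agent_sessions_detect_intent(
    conn,
    "my-project-id",
    "my-session-id",
    body: request
  )

IO.inspect(response.queryResult.fulfillmentText)
```

The response is deserialized into the model structs listed above, so fields such as `queryResult` mirror the JSON field names of the Dialogflow v2 API.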
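Per the `EventInput` entry above, an intent can also be triggered by an event name instead of natural-language text. A minimal sketch, assuming the same generated model names:

```elixir
alias GoogleApi.Dialogflow.V2.Model.{
  GoogleCloudDialogflowV2QueryInput,
  GoogleCloudDialogflowV2EventInput
}

# Trigger the welcome intent by event; the parameters map is available
# to the agent as #welcome_event.name in its responses.
query_input = %GoogleCloudDialogflowV2QueryInput{
  event: %GoogleCloudDialogflowV2EventInput{
    name: "welcome_event",
    parameters: %{"name" => "Sam"},
    languageCode: "en-US"
  }
}
```

This `query_input` can be dropped into the `queryInput` field of the detect-intent request shown above.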