GoogleApi.Vision.V1.Model.GoogleCloudVisionV1p4beta1SafeSearchAnnotation (google_api_vision v0.25.0)

Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).

Attributes

  • adult (type: String.t, default: nil) - Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities.
  • medical (type: String.t, default: nil) - Likelihood that this is a medical image.
  • racy (type: String.t, default: nil) - Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas.
  • spoof (type: String.t, default: nil) - Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive.
  • violence (type: String.t, default: nil) - Likelihood that this image contains violent content. Violent content may include death, serious harm, or injury to individuals or groups of individuals.
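Each of the fields above holds a likelihood string rather than a numeric score. As a minimal sketch, assuming the Vision API's documented Likelihood enum values ("UNKNOWN" through "VERY_LIKELY"), a caller might threshold them like this; the `SafeSearchCheck` module and its ordering map are illustrative helpers, not part of this library:

```elixir
defmodule SafeSearchCheck do
  # Likelihood enum strings as documented by the Vision API, mapped to an
  # illustrative rank so they can be compared. UNKNOWN is ranked lowest here;
  # that choice is an assumption made for this sketch.
  @order %{
    "UNKNOWN" => 0,
    "VERY_UNLIKELY" => 1,
    "UNLIKELY" => 2,
    "POSSIBLE" => 3,
    "LIKELY" => 4,
    "VERY_LIKELY" => 5
  }

  # Returns true when `likelihood` meets or exceeds `threshold`, e.g. to
  # reject an image whose adult likelihood is LIKELY or above.
  def at_least?(likelihood, threshold) do
    Map.get(@order, likelihood, 0) >= Map.fetch!(@order, threshold)
  end
end
```

For example, `SafeSearchCheck.at_least?("VERY_LIKELY", "LIKELY")` returns `true`, while `SafeSearchCheck.at_least?("UNLIKELY", "LIKELY")` returns `false`.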

Summary

Functions

Unwrap a decoded JSON object into its complex fields.

Types

@type t() ::
  %GoogleApi.Vision.V1.Model.GoogleCloudVisionV1p4beta1SafeSearchAnnotation{
    adult: String.t() | nil,
    medical: String.t() | nil,
    racy: String.t() | nil,
    spoof: String.t() | nil,
    violence: String.t() | nil
  }

Functions

@spec decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
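All five fields of this model are plain strings, so `decode/2` has no nested structs to unwrap here; it matters for models whose fields are themselves model structs. As a hedged sketch of typical usage, assuming the Poison JSON library that the google_api_* packages rely on and a hypothetical JSON payload:

```elixir
# Hypothetical SafeSearch fragment of a Vision API response.
json = ~s({"adult": "VERY_UNLIKELY", "violence": "UNLIKELY"})

# Poison's `as:` option names the struct to build; the Poison.Decoder
# protocol then calls decode/2 to unwrap any complex fields (none here).
{:ok, annotation} =
  Poison.decode(json,
    as: %GoogleApi.Vision.V1.Model.GoogleCloudVisionV1p4beta1SafeSearchAnnotation{}
  )

annotation.adult
```

After decoding, `annotation.adult` would hold `"VERY_UNLIKELY"`; fields absent from the payload keep their `nil` default.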