google_api_vision v0.2.0 GoogleApi.Vision.V1.Model.SafeSearchAnnotation

Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).

Attributes

  • adult (String.t): Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities. Defaults to: null.

    • Enum - one of [UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY]
  • medical (String.t): Likelihood that this is a medical image. Defaults to: null.

    • Enum - one of [UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY]
  • racy (String.t): Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas. Defaults to: null.

    • Enum - one of [UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY]
  • spoof (String.t): Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive. Defaults to: null.

    • Enum - one of [UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY]
  • violence (String.t): Likelihood that this image contains violent content. Defaults to: null.

    • Enum - one of [UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY]
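For illustration, a decoded annotation might look like the sketch below. The field values are hypothetical, and unset fields default to nil:

    # A hypothetical decoded annotation for an image flagged as possibly racy.
    annotation = %GoogleApi.Vision.V1.Model.SafeSearchAnnotation{
      adult: "VERY_UNLIKELY",
      medical: "UNLIKELY",
      racy: "POSSIBLE",
      spoof: "VERY_UNLIKELY",
      violence: "UNLIKELY"
    }

    # One common policy: treat POSSIBLE and above as a signal to review.
    flagged? = annotation.racy in ["POSSIBLE", "LIKELY", "VERY_LIKELY"]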

Summary

Functions

Unwrap a decoded JSON object into its complex fields

Types

t()
t() :: %GoogleApi.Vision.V1.Model.SafeSearchAnnotation{
  adult: any(),
  medical: any(),
  racy: any(),
  spoof: any(),
  violence: any()
}

Functions

decode(value, options)
decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
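decode/2 is normally invoked for you during response deserialization rather than called directly. The minimal sketch below assumes Poison as the JSON library, which clients of this generation conventionally use; since SafeSearchAnnotation has no nested model fields, decode/2 returns the struct unchanged:

    # Hypothetical raw JSON from a safeSearchDetection response.
    json = ~s({"adult": "VERY_UNLIKELY", "racy": "POSSIBLE"})

    # Poison's `as:` option builds the struct and invokes decode/2
    # through the Poison.Decoder protocol.
    annotation =
      Poison.decode!(json, as: %GoogleApi.Vision.V1.Model.SafeSearchAnnotation{})

    annotation.racy
    # => "POSSIBLE"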