google_api_vision v0.14.0 GoogleApi.Vision.V1.Model.SafeSearchAnnotation
Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).
Attributes
adult (type: String.t, default: nil) - Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities.

adultConfidence (type: number(), default: nil) - Confidence of adult_score. Range [0, 1]. 0 means not confident, 1 means very confident.

medical (type: String.t, default: nil) - Likelihood that this is a medical image.

medicalConfidence (type: number(), default: nil) - Confidence of medical_score. Range [0, 1]. 0 means not confident, 1 means very confident.

nsfwConfidence (type: number(), default: nil) - Confidence of nsfw_score. Range [0, 1]. 0 means not confident, 1 means very confident.

racy (type: String.t, default: nil) - Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas.

racyConfidence (type: number(), default: nil) - Confidence of racy_score. Range [0, 1]. 0 means not confident, 1 means very confident.

spoof (type: String.t, default: nil) - Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive.

spoofConfidence (type: number(), default: nil) - Confidence of spoof_score. Range [0, 1]. 0 means not confident, 1 means very confident.

violence (type: String.t, default: nil) - Likelihood that this image contains violent content.

violenceConfidence (type: number(), default: nil) - Confidence of violence_score. Range [0, 1]. 0 means not confident, 1 means very confident.
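The likelihood fields above (adult, medical, racy, spoof, violence) are strings drawn from the Vision API's Likelihood enum (UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY). The sketch below is a minimal, illustrative way of acting on those fields; the SafeSearchCheck module name and the choice of which verticals to flag are assumptions, not part of this library.

defmodule SafeSearchCheck do
  # Illustrative helper, not part of google_api_vision: flags an annotation
  # when any of the chosen verticals reports a high likelihood.
  alias GoogleApi.Vision.V1.Model.SafeSearchAnnotation

  @flagged ["LIKELY", "VERY_LIKELY"]

  def unsafe?(%SafeSearchAnnotation{} = annotation) do
    Enum.any?([annotation.adult, annotation.racy, annotation.violence], &(&1 in @flagged))
  end
end

SafeSearchCheck.unsafe?(%GoogleApi.Vision.V1.Model.SafeSearchAnnotation{
  adult: "VERY_UNLIKELY",
  racy: "LIKELY",
  violence: "UNLIKELY"
})
#=> true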
Summary
Functions
Unwrap a decoded JSON object into its complex fields.
Types
t()

t() :: %GoogleApi.Vision.V1.Model.SafeSearchAnnotation{
  adult: String.t(),
  adultConfidence: number(),
  medical: String.t(),
  medicalConfidence: number(),
  nsfwConfidence: number(),
  racy: String.t(),
  racyConfidence: number(),
  spoof: String.t(),
  spoofConfidence: number(),
  violence: String.t(),
  violenceConfidence: number()
}
Functions
Unwrap a decoded JSON object into its complex fields.
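As a rough usage sketch, a raw safeSearchAnnotation JSON fragment can be parsed directly into this struct before unwrapping its fields. Using Poison's as: option here is an assumption based on the library's usual JSON dependency, and the field values are illustrative only.

json = ~s({"adult": "VERY_UNLIKELY", "spoof": "POSSIBLE", "violence": "UNLIKELY"})

# Assumption: Poison is available (the JSON library google_api_vision typically
# ships with) and its `as:` option decodes into the given struct template.
# Fields absent from the JSON keep their nil defaults.
annotation = Poison.decode!(json, as: %GoogleApi.Vision.V1.Model.SafeSearchAnnotation{})

annotation.spoof
#=> "POSSIBLE"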