ImageRegionsImageRegions

AI Overview

  • The potential purpose of this module is to analyze and classify images based on their content, particularly with respect to inappropriate or offensive material such as pornography, violence, and pedophilia. It also carries other image-level signals, such as whether a 300k thumbnail exists and whether the image has navboost data.
  • This module could impact search results by filtering out or demoting images that contain inappropriate or offensive content (see the sketch after this list), or by promoting images deemed safe and suitable for users. This could lead to a safer, more family-friendly search experience, but it could also lead to over-filtering or biased results.
  • A website can make itself more favorable to this function by ensuring its images are properly labeled and categorized and by avoiding content that could be deemed inappropriate or offensive. It can also optimize images with relevant metadata and keywords and provide clear, concise descriptions of image content. Additionally, websites could use content moderation tools and services to help identify and remove inappropriate content.
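
As a purely illustrative sketch of the second point above, the Elixir snippet below imagines how a downstream consumer of this model might threshold the image-level scores to decide whether an image should be filtered or demoted. The module name, the thresholds, and the decision logic are assumptions made for illustration only and do not reflect how Google actually combines these signals.

# Hypothetical consumer of the model; thresholds are invented for illustration.
defmodule SafeImageSketch do
  alias GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegions

  @porn_threshold 0.8
  @violence_threshold 0.8

  # Returns true when any image-level signal suggests the image should be filtered.
  def filter?(%ImageRegionsImageRegions{} = image) do
    image.isIuInappropriate == true or
      (image.finalPornScore || 0.0) >= @porn_threshold or
      (image.finalViolenceScore || 0.0) >= @violence_threshold
  end
end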

GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegions (google_api_content_warehouse v0.4.0)

An image with regions within it. NEXT_ID: 11

Attributes

  • finalPornScore (type: number(), default: nil) - The final_porn_score for the image.
  • finalViolenceScore (type: number(), default: nil) - The final_violence_score for the image.
  • flowOutput (type: GoogleApi.ContentWarehouse.V1.Model.ImageContentFlowProtoProd.t, default: nil) - The output of various features generated by the Flow framework, most importantly data from Starburst (go/starburst).
  • has300kThumb (type: boolean(), default: nil) - True if the image has a 300k thumb.
  • hasNavboost (type: boolean(), default: nil) - True if the image has navboost.
  • isIuInappropriate (type: boolean(), default: nil) - True if the image is iu-inappropriate.
  • pedoScore (type: number(), default: nil) - The pedo_score of the image.
  • precomputedRestricts (type: GoogleApi.ContentWarehouse.V1.Model.PrecomputedRestricts.t, default: nil) - The precomputed restricts for the image.
  • racyScore (type: number(), default: nil) - The racy_score of the image.
  • region (type: list(GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegion.t), default: nil) - The list of regions.
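
For orientation, here is a hedged sketch of what an instance of this struct looks like when built by hand in Elixir. The field names come from the attribute list above, but every value and the empty region list are made up for illustration; real instances come from decoded API responses.

# Purely illustrative values; real structs are produced by decoding JSON responses.
image = %GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegions{
  finalPornScore: 0.02,
  finalViolenceScore: 0.01,
  pedoScore: 0.0,
  racyScore: 0.15,
  has300kThumb: true,
  hasNavboost: false,
  isIuInappropriate: false,
  region: []
}

image.racyScore
#=> 0.15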

Summary

Types

t()

Functions

decode(value, options)

Unwrap a decoded JSON object into its complex fields.

Types

t()

@type t() :: %GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegions{
  finalPornScore: number() | nil,
  finalViolenceScore: number() | nil,
  flowOutput:
    GoogleApi.ContentWarehouse.V1.Model.ImageContentFlowProtoProd.t() | nil,
  has300kThumb: boolean() | nil,
  hasNavboost: boolean() | nil,
  isIuInappropriate: boolean() | nil,
  pedoScore: number() | nil,
  precomputedRestricts:
    GoogleApi.ContentWarehouse.V1.Model.PrecomputedRestricts.t() | nil,
  racyScore: number() | nil,
  region:
    [GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegion.t()] | nil
}

Functions

decode(value, options)

@spec decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
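
A minimal usage sketch, assuming the standard Poison-based decoding pattern that the generated google_api clients follow; the JSON payload below is invented for illustration.

json = ~s({"finalPornScore": 0.02, "hasNavboost": true, "region": []})

# Decoding with `as:` routes through the model's Poison.Decoder implementation,
# which calls decode/2 to unwrap complex fields (flowOutput, precomputedRestricts,
# region) into their corresponding model structs.
image =
  Poison.decode!(json,
    as: %GoogleApi.ContentWarehouse.V1.Model.ImageRegionsImageRegions{}
  )

image.hasNavboost
#=> true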