VideoFileSphericalMetadata

AI Overview😉

  • The potential purpose of this module is to analyze and understand the metadata of 360-degree videos, also known as spherical videos. This includes information about the video's projection type, field of view, camera motion, and other technical details.
  • This module could impact search results by allowing Google to better understand the content and quality of 360-degree videos. This could lead to more accurate and relevant search results for users searching for specific types of videos, such as virtual tours or immersive experiences. Additionally, it could enable Google to provide more detailed and informative video snippets or previews in search results.
  • A website may make itself more favorable to this function by ensuring that its 360-degree videos are properly tagged and annotated with metadata such as the projection type, field of view, and camera motion. This could involve using standardized metadata formats (a sketch follows this list) and providing detailed information about the video's technical specifications. Additionally, websites could optimize their video content and encoding so that it is compatible with Google's analysis and indexing algorithms.
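
As a concrete illustration of "standardized metadata formats", Google's publicly documented spatial-media spec embeds a GSpherical RDF/XML block in the video container. The sketch below holds a V1-style block in an Elixir string purely for illustration; every value is hypothetical, and the tag names mirror attributes of this module (ProjectionType → projectionType, Stitched → stitched, SourceCount → sourceCount, and so on).

# Hypothetical Spherical Video V1 metadata, shown here as an Elixir string.
# Values are illustrative only; see the spatial-media spec for the full tag set.
spherical_v1_xml = """
<rdf:SphericalVideo
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">
  <GSpherical:Spherical>true</GSpherical:Spherical>
  <GSpherical:Stitched>true</GSpherical:Stitched>
  <GSpherical:StitchingSoftware>Example Stitcher 1.0</GSpherical:StitchingSoftware>
  <GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>
  <GSpherical:SourceCount>6</GSpherical:SourceCount>
</rdf:SphericalVideo>
"""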

GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadata (google_api_content_warehouse v0.4.0)

Globally allowed spherical metadata.

Attributes

  • clampedOptimalFovBounds (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataFOVBounds.t, default: nil) - Like optimalFovBounds, but with high-pass motion filtering applied and yaw rotation limited to +/- 15 degrees.
  • cubemap (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataCubemapProjection.t, default: nil) -
  • deprecatedCroppedArea (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataCroppedArea.t, default: nil) -
  • deprecatedInitialView (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataViewDirection.t, default: nil) - InitialView is from the v1 spec and is more or less equivalent to Pose from the v2 spec. Therefore, an InitialView found in XML metadata will populate the pose field in this proto.
  • equirect (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataEquirectProjection.t, default: nil) -
  • fullPanoHeightPixels (type: integer(), default: nil) -
  • fullPanoWidthPixels (type: integer(), default: nil) - Dimensions of the full video frame.
  • mesh (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataMeshProjection.t, default: nil) -
  • metadataSource (type: String.t, default: nil) - Metadata source, e.g. v2 (svhd).
  • optimalFovBounds (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataFOVBounds.t, default: nil) - If the video contains Wally-sanitized mesh and camera motion metadata (see go/wally-format), this contains the optimal FOV (the smallest FOV that encompasses all combinations of input mesh FOV and rotations). This field is only present if a full FfmpegAnalyze is performed.
  • pose (type: GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataPose.t, default: nil) -
  • projectionType (type: String.t, default: nil) - Mapping type used to map the sphere to the rectangular video, e.g., "equirectangular" (http://en.wikipedia.org/wiki/Equirectangular_projection). This is kept as a string so that we can retain values that are unknown to us.
  • sourceCount (type: integer(), default: nil) - The number of camera sources used to generate this video.
  • spherical (type: boolean(), default: nil) - Whether the video is spherical or not.
  • stereoMode (type: String.t, default: nil) - The stereo mode.
  • stitched (type: boolean(), default: nil) - Whether the video has already been stitched.
  • stitchingSoftware (type: String.t, default: nil) - The stitching software.
  • timestamp (type: String.t, default: nil) - Epoch timestamp of when the first frame in the video was recorded.
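
To make the field list concrete, here is a minimal sketch of a populated struct for a stitched, monoscopic equirectangular video; every value is illustrative and not taken from any real file.

alias GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadata

# Hypothetical values for a stitched, monoscopic equirectangular video.
metadata = %VideoFileSphericalMetadata{
  spherical: true,
  stitched: true,
  stitchingSoftware: "Example Stitcher 1.0",
  projectionType: "equirectangular",
  sourceCount: 2,
  fullPanoWidthPixels: 3840,
  fullPanoHeightPixels: 1920,
  timestamp: "1400000000"
}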

Summary

Types

t()

Functions

decode(value, options)

Unwrap a decoded JSON object into its complex fields.

Types

t()

@type t() :: %GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadata{
  clampedOptimalFovBounds:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataFOVBounds.t()
    | nil,
  cubemap:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataCubemapProjection.t()
    | nil,
  deprecatedCroppedArea:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataCroppedArea.t()
    | nil,
  deprecatedInitialView:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataViewDirection.t()
    | nil,
  equirect:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataEquirectProjection.t()
    | nil,
  fullPanoHeightPixels: integer() | nil,
  fullPanoWidthPixels: integer() | nil,
  mesh:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataMeshProjection.t()
    | nil,
  metadataSource: String.t() | nil,
  optimalFovBounds:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataFOVBounds.t()
    | nil,
  pose:
    GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadataPose.t() | nil,
  projectionType: String.t() | nil,
  sourceCount: integer() | nil,
  spherical: boolean() | nil,
  stereoMode: String.t() | nil,
  stitched: boolean() | nil,
  stitchingSoftware: String.t() | nil,
  timestamp: String.t() | nil
}

Functions

decode(value, options)

@spec decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
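
A minimal usage sketch, assuming the Poison-based decoding conventions used across the google_api_* libraries; the JSON payload and its values are hypothetical.

json = ~s({"spherical": true, "projectionType": "equirectangular", "sourceCount": 2})

# Poison dispatches to decode/2 via the model's Poison.Decoder implementation,
# unwrapping nested fields (e.g. pose, equirect) into their own structs.
metadata =
  Poison.decode!(json,
    as: %GoogleApi.ContentWarehouse.V1.Model.VideoFileSphericalMetadata{}
  )

metadata.projectionType
#=> "equirectangular"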