TABATABAI ALI (US)
SONY CORP AMERICA (US)
WO2023172509A1 | 2023-09-14
US198762633800P
DANILLO B GRAZIOSI (SONY) ET AL: "[VDMC-NEW] Proposed Atlas Syntax for V-DMC", no. m60984, 19 October 2022 (2022-10-19), XP030305376, retrieved from the Internet
PATENT Atty. Docket No. SYP349901WO02/SONY-76300WO

CLAIMS

What is claimed is:

1. A method programmed in a non-transitory memory of a device comprising: implementing atlas mapping processing; performing connectivity processing to divide triangles to generate vertex information; and performing vertex position processing to adjust positions of the vertices in the vertex information.

2. The method of claim 1 further comprising: implementing a displacement sub-bitstream to include displacement information, wherein the displacement information is utilized by the vertex position processing.

3. The method of claim 2 further comprising reconstructing a mesh based on the displaced vertices and the connectivity information.

4. The method of claim 1 wherein atlas mapping processing includes: receiving patch identification information and mapping function parameters; and generating (u,v) coordinates based on the patch identification and mapping function parameters.

5. The method of claim 1 further comprising utilizing one or more flags to bypass one or more V-DMC methods and to use a texture attribute map video as a V3C attribute component video.

6. The method of claim 1 wherein implementing atlas mapping processing occurs before performing connectivity processing and vertex position processing.

7. The method of claim 1 wherein implementing atlas mapping processing occurs after performing connectivity processing and vertex position processing.

8. An apparatus comprising: a non-transitory memory for storing an application, the application for: implementing atlas mapping processing; performing connectivity processing to divide triangles to generate vertex information; and performing vertex position processing to adjust positions of the vertices in the vertex information; and a processor coupled to the memory, the processor configured for processing the application.

9. The apparatus of claim 8 wherein the application is further for: implementing a displacement sub-bitstream to include displacement information, wherein the displacement information is utilized by the vertex position processing.

10. The apparatus of claim 9 wherein the application is further for reconstructing a mesh based on the displaced vertices and the connectivity information.

11. The apparatus of claim 8 wherein atlas mapping processing includes: receiving patch identification information and mapping function parameters; and generating (u,v) coordinates based on the patch identification and mapping function parameters.

12. The apparatus of claim 8 wherein the application is further for utilizing one or more flags to bypass one or more V-DMC methods and to use a texture attribute map video as a V3C attribute component video.

13. The apparatus of claim 8 wherein implementing atlas mapping processing occurs before performing connectivity processing and vertex position processing.

14. The apparatus of claim 8 wherein implementing atlas mapping processing occurs after performing connectivity processing and vertex position processing.

15. A system comprising: an encoder configured for encoding a 3D mesh to generate patch identification information and mapping function parameters; and a decoder configured for: implementing atlas mapping processing on the patch identification information and the mapping function parameters; performing connectivity processing to divide triangles to generate vertex information; and performing vertex position processing to adjust positions of the vertices in the vertex information.

16. The system of claim 15 wherein the decoder is further for: implementing a displacement sub-bitstream to include displacement information, wherein the displacement information is utilized by the vertex position processing.

17. The system of claim 16 wherein the decoder is further for reconstructing a mesh based on the displaced vertices and the connectivity information.

18. The system of claim 15 wherein atlas mapping processing includes: receiving patch identification information and mapping function parameters; and generating (u,v) coordinates based on the patch identification and mapping function parameters.

19. The system of claim 15 wherein the decoder is further for utilizing one or more flags to bypass one or more V-DMC methods and to use a texture attribute map video as a V3C attribute component video.

20. The system of claim 15 wherein implementing atlas mapping processing occurs before performing connectivity processing and vertex position processing.

21. The system of claim 15 wherein implementing atlas mapping processing occurs after performing connectivity processing and vertex position processing.
asps_vmc_ext_patch_mapping_method    Description
0    all the triangles in the corresponding submesh are associated with the current patch
1    pairs of facegroupIds and patchIds are explicitly signaled in the base mesh
2    pairs of facegroupIds and patchIds are explicitly signaled in the mesh patch data unit
3...256    RESERVED

asps_vmc_ext_patch_mesh_data_enabled_flag equal to 1 specifies that the mesh information (i.e., number of triangles, number of vertices, 3D bounding box) of a patch with index i in a frame with index j will be indicated in the bitstream for each patch data unit. If asps_vmc_ext_patch_mesh_data_enabled_flag is equal to 0, then the mesh information shall be obtained by processing the information in the base mesh data unit.

atlas_map_processing_information( ) {    Descriptor
    ampi_projection_enabled_flag    u(1)
    if( ampi_projection_enabled_flag )
        ampi_atlas_mapping_method    ue(v)
    if( ampi_atlas_mapping_method == 0 ) {
        ampi_ortho_atlas_gutter    ue(v)
        ampi_ortho_atlas_width_scale    fl(64)
        ampi_ortho_atlas_height_scale    fl(64)
    }
}

ampi_projection_enabled_flag indicates that the 2D locations where attributes are projected are explicitly signaled in the mesh patch data units. Therefore, the projection id and orientation index discussed previously can also be used.

ampi_atlas_mapping_method indicates the type of atlas mapping method used. When the signaled ampi_atlas_mapping_method is equal to 0, it indicates that the 2D (u, v) locations of the orthographic projection of 3D vertices are derived on the decoder side and are not signaled within the mesh bitstream.

ampi_atlas_mapping_method    Name of atlas mapping method
0    orthoAtlas

ampi_ortho_atlas_gutter: a safeguard space between patches during the patch packing process when using "orthoAtlas".

ampi_ortho_atlas_width_scale: width scaling factor used for packing the patches in all frames in the sequence.
If not present, 1.0 should be used.

ampi_ortho_atlas_height_scale: height scaling factor used for packing the patches in all frames in the sequence. If not present, 1.0 should be used.

connectivity_processing_information( ) {    Descriptor
    cpi_connectivity_method    u(3)
    if( cpi_connectivity_method == SUBDIVISION ) {
        cpi_subdivision_method    u(3)
        cpi_subdivision_iteration_count    u(8)
    }
    cpi_tjunction_removing_method    ue(v)
}

cpi_connectivity_method indicates the type of connectivity method used.

cpi_connectivity_method    Name of connectivity method
0    SUBDIVISION

cpi_subdivision_method indicates the subdivision method.

cpi_subdivision_method    Name of subdivision method
0    MIDPOINT

cpi_subdivision_iteration_count indicates the number of times the subdivision needs to be applied recursively.

cpi_tjunction_removing_method indicates the method to remove t-junctions generated by different subdivision methods or by different subdivision iterations of two triangles sharing an edge.

displacement_processing_information( cFlg, tFlg, peFlg, poFlg, ltpIndex ) {    Descriptor
    if( cFlg )
        dppi_displacement_coordinate_system[ ltpIndex ]    u(1)
    if( tFlg ) {
        dppi_transform_index[ ltpIndex ][ 0 ]    u(3)
        if( dppi_transform_index[ ltpIndex ][ 0 ] == LINEAR_LIFTING && peFlg )
            vmc_lifting_transform_parameters( 0, ltpIndex )
    }
    if( poFlg ) {
        for( i = 1; i < asps_vmc_ext_num_displacement_video; i++ ) {
            dppi_transform_index[ ltpIndex ][ i ]    u(3)
            if( dppi_transform_index[ ltpIndex ][ i ] == LINEAR_LIFTING ) {
                dppi_extension_transform_parameters_present_flag[ ltpIndex ][ i ]    u(1)
                if( dppi_extension_transform_parameters_present_flag[ ltpIndex ][ i ] )
                    vmc_lifting_transform_parameters( i, ltpIndex )
            }
        }
    }
}

dppi_displacement_coordinate_system[ ltpIndex ] equal to 0 indicates that a global coordinate system will be used.
dppi_displacement_coordinate_system[ ltpIndex ] equal to 1 indicates that a conversion to a local coordinate system will be used.

dppi_transform_index[ ltpIndex ][ i ] indicates the transform applied to the displacement video with index i. The transform index can also indicate that no transform is applied. When the transform is "LINEAR_LIFTING", the necessary parameters are signaled as "vmc_lifting_transform_parameters".

dppi_transform_index    Name of transform method
0    NONE
1    LINEAR_LIFTING

dppi_extension_transform_parameters_present_flag[ ltpIndex ][ i ] equal to 1 specifies that the vmc_lifting_transform_parameters( ) syntax structure for the displacement video with index i is present. dppi_extension_transform_parameters_present_flag equal to 0 specifies that the vmc_lifting_transform_parameters( ) syntax structure is not present.

vmc_lifting_transform_parameters( dispIndex, ltpIndex ) {    Descriptor
    vmc_transform_lifting_skip_update[ dispIndex ][ ltpIndex ]    u(1)
    for( i = 0; i < 3; i++ ) {
        vmc_transform_lifting_quantization_parameters[ dispIndex ][ ltpIndex ][ i ]    ue(v)
        vmc_transform_log2_lifting_lod_inverseScale[ dispIndex ][ ltpIndex ][ i ]    ue(v)
    }
    vmc_transform_log2_lifting_update_weight[ dispIndex ][ ltpIndex ]    ue(v)
    vmc_transform_log2_lifting_prediction_weight[ dispIndex ][ ltpIndex ]    ue(v)
}

ltpIndex indicates the level at which the parameters are signaled, whereby level 0 is for the whole sequence, level 1 for the frame, and level 2 for the patch.

Atlas Frame Parameter Set extension

afps_vmc_extension( ) {    Descriptor
    afps_vmc_ext_single_submesh_in_frame_flag    u(1)
    if( !afps_vmc_ext_single_submesh_in_frame_flag ) {
        afps_num_submesh    ue(v)
        for( i = 0; i < afps_num_submesh; i++ ) {
            afps_submesh_id[ i ]    u(v)
            TileIDToSubmeshID[ afti_tile_id[ i ] ] = afps_submesh_id[ i ]
        }
    }
    afps_vmc_ext_overridden_flag    u(1)
    if( afps_vmc_ext_overridden_flag ) {
        afps_vmc_ext_connectivity_enabled_flag    u(1)
        if( afps_vmc_ext_connectivity_enabled_flag )
            connectivity_processing_information( )
        afps_vmc_ext_displacement_enabled_flag    u(1)
        if( afps_vmc_ext_displacement_enabled_flag ) {
            afps_vmc_ext_displacement_coordinate_system_enabled_flag    u(1)
            cFlag = afps_vmc_ext_displacement_coordinate_system_enabled_flag
            afps_vmc_ext_transform_index_enabled_flag    u(1)
            tFlag = afps_vmc_ext_transform_index_enabled_flag
            afps_vmc_ext_transform_parameters_enabled_flag    u(1)
            peFlag = afps_vmc_ext_transform_parameters_enabled_flag
            afps_vmc_ext_attribute_parameter_override_flag    u(1)
            poFlag = afps_vmc_ext_attribute_parameter_override_flag
            displacement_processing_information( cFlag, tFlag, peFlag, poFlag, 1 )
        }
    }
    afps_vmc_ext_single_vertex_property_tile_in_frame_flag    u(1)
    if( !afps_vmc_ext_single_vertex_property_tile_in_frame_flag )
        for( i = 0; i < vps_vmc_ext_vertex_property_video_count[ atlasId ]; i++ ) {
            afps_ext_vmc_vertex_property_tile_information( i )
        }
}

afps_num_submesh indicates the number of sub-meshes in a mesh. In the current high-level syntax, it is assumed to be equal to afti_num_tiles_in_atlas_frame_minus1 + 1.

afps_submesh_id[ i ] indicates the ID associated with the sub-mesh with index i.

afps_vmc_ext_single_submesh_in_frame_flag indicates there is only one sub-mesh for the mesh frame.

afps_vmc_ext_overridden_flag, when true, indicates that the subdivision method, displacement coordinate system, transform index, transform parameters, and attribute transform parameters override the ones signaled in asps_vmc_extension( ).

afps_vmc_ext_single_vertex_property_tile_in_frame_flag indicates there is only one tile for each vertex property signaled in the video streams.
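The subdivision-based connectivity processing signaled above (cpi_connectivity_method equal to SUBDIVISION with the MIDPOINT method) can be illustrated with a small sketch. This is a non-normative simplification, not the V-DMC process itself; the function name and data layout are hypothetical, and t-junction removal is not modeled because all triangles here share the same iteration count.

```python
# Hedged sketch of MIDPOINT subdivision (cpi_subdivision_method == 0),
# applied recursively cpi_subdivision_iteration_count times.
# Illustrative only; names and data layout are hypothetical.

def midpoint_subdivide(vertices, triangles, iteration_count):
    """Split every triangle into four by inserting edge midpoints.

    Sharing the midpoint of a common edge between neighboring
    triangles avoids creating t-junctions when both triangles are
    subdivided to the same depth.
    """
    for _ in range(iteration_count):
        midpoint_of = {}          # canonical edge -> new vertex index
        new_triangles = []
        for (a, b, c) in triangles:
            mids = []
            for u, v in ((a, b), (b, c), (c, a)):
                edge = (min(u, v), max(u, v))   # shared between neighbors
                if edge not in midpoint_of:
                    midpoint_of[edge] = len(vertices)
                    vertices.append(tuple((p + q) / 2.0
                                          for p, q in zip(vertices[u],
                                                          vertices[v])))
                mids.append(midpoint_of[edge])
            ab, bc, ca = mids
            new_triangles += [(a, ab, ca), (ab, b, bc),
                              (ca, bc, c), (ab, bc, ca)]
        triangles = new_triangles
    return vertices, triangles

# One iteration turns 1 triangle into 4 and 3 vertices into 6.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
verts, tris = midpoint_subdivide(verts, [(0, 1, 2)], 1)
```

After subdivision, vertex position processing would add the decoded displacement of each generated vertex (carried in the displacement sub-bitstream, transformed as signaled by dppi_transform_index) to these positions.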
Note that the attribute tile information specified previously is able to use the atlas_frame_tile_information in the case of mapped attributes.

afps_ext_vmc_vertex_property_tile_information( vpID ) {    Descriptor
    afps_vmc_ext_vertex_property_ti_uniform_partition_spacing_flag[ vpID ]    u(1)
    if( afps_vmc_ext_vertex_property_ti_uniform_partition_spacing_flag[ vpID ] ) {
        afps_vmc_ext_vertex_property_ti_partition_cols_width_minus1[ vpID ]    ue(v)
        afps_vmc_ext_vertex_property_ti_partition_rows_height_minus1[ vpID ]    ue(v)
    } else {
        afps_vmc_ext_vertex_property_ti_num_partition_columns_minus1[ vpID ]    ue(v)
        afps_vmc_ext_vertex_property_ti_num_partition_rows_minus1[ vpID ]    ue(v)
        for( i = 0; i < afps_vmc_ext_vertex_property_ti_num_partition_columns_minus1[ vpID ]; i++ )
            afps_vmc_ext_vertex_property_ti_partition_column_width_minus1[ vpID ][ i ]    ue(v)
        for( i = 0; i < afps_vmc_ext_vertex_property_ti_num_partition_rows_minus1[ vpID ]; i++ )
            afps_vmc_ext_vertex_property_ti_partition_row_height_minus1[ vpID ][ i ]    ue(v)
    }
    afps_vmc_ext_vertex_property_ti_single_partition_per_tile_flag[ vpID ]    u(1)
    if( !afps_vmc_ext_vertex_property_ti_single_partition_per_tile_flag[ vpID ] ) {
        afps_vmc_ext_vertex_property_ti_num_tiles_in_atlas_frame_minus1[ vpID ]    ue(v)
        for( i = 0; i < afps_vmc_ext_vertex_property_ti_num_tiles_in_atlas_frame_minus1[ vpID ] + 1; i++ ) {
            afps_vmc_ext_vertex_property_ti_top_left_partition_idx[ vpID ][ i ]    ue(v)
            afps_vmc_ext_vertex_property_ti_bottom_right_partition_column_offset[ vpID ][ i ]    ue(v)
            afps_vmc_ext_vertex_property_ti_bottom_right_partition_row_offset[ vpID ][ i ]    ue(v)
        }
    }
}

Patch Data Unit

mesh_intra_data_unit( tileID, patchIdx ) {    Descriptor
    if( asps_vmc_ext_patch_mesh_data_enabled_flag ) {
        mdu_mesh_present_flag[ tileID ][ patchIdx ]    u(1)
        if( mdu_mesh_present_flag[ tileID ][ patchIdx ] ) {
            mdu_vertex_count_minus1[ tileID ][ patchIdx ]    ue(v)
            mdu_triangle_count_minus1[ tileID ][ patchIdx ]    ue(v)
            for( i = 0; i < 3; i++ ) {
                mdu_3d_bounding_box_min[ tileID ][ patchIdx ][ i ]    ue(v)
                mdu_3d_bounding_box_max[ tileID ][ patchIdx ][ i ]    ue(v)
            }
        }
    }
    if( asps_vmc_ext_patch_mapping_method == 2 ) {
        mdu_num_facegroups[ tileID ][ patchIdx ]    ue(v)
        for( i = 0; i < mdu_num_facegroups[ tileID ][ patchIdx ]; i++ )
            mdu_facegroup_id[ tileID ][ patchIdx ][ i ]    ue(v)
    }
    if( ampi_atlas_mapping_method == 0 ) {
        mdu_2d_pos_x[ tileID ][ patchIdx ]    ue(v)
        mdu_2d_pos_y[ tileID ][ patchIdx ]    ue(v)
        mdu_2d_size_x_minus1[ tileID ][ patchIdx ]    ue(v)
        mdu_2d_size_y_minus1[ tileID ][ patchIdx ]    ue(v)
        mdu_projection_id[ tileID ][ patchIdx ]    u(v)
        mdu_orientation_index[ tileID ][ patchIdx ]    u(v)
        if( afps_lod_mode_enabled_flag ) {
            mdu_lod_enabled_flag[ tileID ][ patchIdx ]    u(1)
            if( mdu_lod_enabled_flag[ tileID ][ patchIdx ] ) {
                mdu_lod_scale_x_minus1[ tileID ][ patchIdx ]    ue(v)
                mdu_lod_scale_y_idc[ tileID ][ patchIdx ]    ue(v)
            }
        }
    }
    mdu_patch_parameters_enable_flag[ tileID ][ patchIdx ]    u(1)
    if( mdu_patch_parameters_enable_flag[ tileID ][ patchIdx ] ) {
        mdu_subdivision_enable_flag[ tileID ][ patchIdx ]    u(1)
        mdu_displacement_enable_flag[ tileID ][ patchIdx ]    u(1)
    }
    if( mdu_subdivision_enable_flag[ tileID ][ patchIdx ] ) {
        mdu_subdivision_method[ tileID ][ patchIdx ]    u(3)
        mdu_subdivision_iteration_count[ tileID ][ patchIdx ]    u(8)
    }
    if( mdu_displacement_enable_flag[ tileID ][ patchIdx ] ) {
        mdu_displacement_coordinate_system_enabled_flag    u(1)
        cFlag = mdu_displacement_coordinate_system_enabled_flag
        mdu_transform_index_enabled_flag    u(1)
        tFlag = mdu_transform_index_enabled_flag
        mdu_transform_parameters_enabled_flag    u(1)
        peFlag = mdu_transform_parameters_enabled_flag
        mdu_attribute_parameter_override_flag    u(1)
        poFlag = mdu_attribute_parameter_override_flag
        displacement_processing_information( cFlag, tFlag, peFlag, poFlag, 2 )
    }
}

mdu_mesh_present_flag[ tileID ][ p ] equal to 1 specifies that the syntax elements related to the mesh are present in the patch p in the current atlas tile, with tile ID equal to tileID. mdu_mesh_present_flag equal to 0 specifies that the syntax elements related to the mesh are not present.

mdu_vertex_count_minus1[ tileID ][ p ] plus one specifies the number of vertices in the patch with index p of the current atlas tile, with tile ID equal to tileID. If not present, this value shall be determined by processing the base mesh data unit.

mdu_triangle_count_minus1[ tileID ][ p ] plus one specifies the number of triangles in the patch with index p of the current atlas tile, with tile ID equal to tileID. If not present, this value shall be determined by processing the base mesh data unit.

mdu_3d_bounding_box_min[ tileID ][ p ][ i ] specifies the minimum value for the bounding box of the patch with index p of the current atlas tile, with tile ID equal to tileID, along the ith axis. The value of mdu_3d_bounding_box_min[ tileID ][ p ][ i ] shall be in the range of 0 to 2^( asps_geometry_3d_bit_depth_minus1 + 1 ) − 1, inclusive. The number of bits used to represent mdu_3d_bounding_box_min[ tileID ][ p ][ i ] is asps_geometry_3d_bit_depth_minus1 + 1. If not present, this value shall be determined by processing the base mesh data unit.

mdu_3d_bounding_box_max[ tileID ][ p ][ i ] specifies the maximum value for the bounding box of the patch with index p of the current atlas tile, with tile ID equal to tileID, along the ith axis.
The value of mdu_3d_bounding_box_max[ tileID ][ p ][ i ] shall be in the range of 0 to 2^( asps_geometry_3d_bit_depth_minus1 + 1 ) − 1, inclusive. The number of bits used to represent mdu_3d_bounding_box_max[ tileID ][ p ][ i ] is asps_geometry_3d_bit_depth_minus1 + 1. If not present, this value shall be determined by processing the base mesh data unit.

mdu_2d_pos_x[ tileID ][ p ] specifies the x-coordinate of the top-left corner of the patch bounding box for the patch p in the current atlas tile, with tile ID equal to tileID, expressed as a multiple of PatchPackingBlockSize.

mdu_2d_pos_y[ tileID ][ p ] specifies the y-coordinate of the top-left corner of the patch bounding box for the patch p in the current atlas tile, with tile ID equal to tileID, expressed as a multiple of PatchPackingBlockSize.

mdu_2d_size_x_minus1[ tileID ][ p ] plus 1 specifies the width value of the patch with index p in the current atlas tile, with tile ID equal to tileID.

mdu_2d_size_y_minus1[ tileID ][ p ] plus 1 specifies the height value of the patch with index p in the current atlas tile, with tile ID equal to tileID.

mdu_projection_id[ tileID ][ p ] specifies the values of the projection mode and of the index of the normal to the projection plane for the patch with index p of the current atlas tile, with tile ID equal to tileID. The value of mdu_projection_id[ tileID ][ p ] shall be in the range of 0 to asps_max_number_projections_minus1, inclusive. The number of bits used to represent mdu_projection_id[ tileID ][ p ] is Ceil( Log2( asps_max_number_projections_minus1 + 1 ) ).

mdu_orientation_index[ tileID ][ p ] specifies the patch orientation index, for the patch with index p of the current atlas tile, with tile ID equal to tileID, used to determine the matrices P, R and O, as indicated in Eq. (1) and Table 1, respectively, for reconstruction.
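Several of the variable-length fields above have bit counts derived from other syntax elements rather than being fixed. The short sketch below illustrates the two derivations stated in the semantics; the function names are illustrative, not part of the syntax.

```python
import math

def projection_id_bit_count(asps_max_number_projections_minus1):
    # Number of bits used to represent mdu_projection_id:
    # Ceil( Log2( asps_max_number_projections_minus1 + 1 ) )
    return math.ceil(math.log2(asps_max_number_projections_minus1 + 1))

def bounding_box_component_max(asps_geometry_3d_bit_depth_minus1):
    # Upper bound of a mdu_3d_bounding_box_min/max component:
    # 2^( asps_geometry_3d_bit_depth_minus1 + 1 ) - 1
    return (1 << (asps_geometry_3d_bit_depth_minus1 + 1)) - 1

# e.g. with up to 6 projections (minus1 == 5), 3 bits are needed,
# and a 10-bit geometry (minus1 == 9) bounds each component at 1023.
bits = projection_id_bit_count(5)
max_component = bounding_box_component_max(9)
```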
mdu_lod_enabled_flag[ tileID ][ p ] equal to 1 specifies that the LOD parameters are present for the current patch p of the current atlas tile, with tile ID equal to tileID. If mdu_lod_enabled_flag[ tileID ][ p ] is equal to 0, no LOD parameters are present for the current patch. If mdu_lod_enabled_flag[ tileID ][ p ] is not present, its value shall be inferred to be equal to 0.

mdu_lod_scale_x_minus1[ tileID ][ p ] specifies the LOD scaling factor to be applied to the local x coordinate of a point in a patch with index p of the current atlas tile, with tile ID equal to tileID, prior to its addition to the patch coordinate TilePatch3dOffsetU[ tileID ][ p ]. If mdu_lod_scale_x_minus1[ tileID ][ p ] is not present, its value shall be inferred to be equal to 0.

mdu_lod_scale_y_idc[ tileID ][ p ] indicates the LOD scaling factor to be applied to the local y coordinate of a point in a patch with index p of the current atlas tile, with tile ID equal to tileID, prior to its addition to the patch coordinate TilePatch3dOffsetV[ tileID ][ p ]. If mdu_lod_scale_y_idc[ tileID ][ p ] is not present, its value shall be inferred to be equal to 0.

mesh_inter_data_unit( tileID, patchIdx ), mesh_merge_data_unit( tileID, patchIdx ), and mesh_skip_data_unit( tileID, patchIdx ) are able to be easily defined. However, the usage of mesh_raw_data_unit( tileID, patchIdx ) should be further discussed by the group. Notice that if tileID = submeshID (which has the same independent decoding functionality) is used, there is no need to transmit a different ID to link the patch to the submesh, and the tileID is able to be used for that purpose.
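The patch information above supplies what a decoder needs to derive the (u, v) coordinates of orthoAtlas patches, as formalized in the Reconstruction section that follows. The sketch below is a simplified, non-normative illustration: it assumes an axis-aligned orthographic projection (the projection id selects which axis is dropped) and the identity patch orientation, omitting the R and O matrices of Eq. (1) and Table 1; the function name and parameters are hypothetical.

```python
# Non-normative sketch of decoder-side (u, v) derivation for an
# orthoAtlas patch, assuming axis-aligned projection and identity
# orientation. The normative process also applies the orientation
# matrices R( oIdx ) and O( oIdx ).

def derive_uv(xyz, bb_min, projection_axis,
              u0, v0, lod_x=1.0, lod_y=1.0):
    """Map a reconstructed 3D vertex to a 2D atlas coordinate.

    xyz             : reconstructed vertex position (x, y, z)
    bb_min          : per-patch 3D bounding box minimum
    projection_axis : axis dropped by the orthographic projection (0, 1, 2)
    u0, v0          : top-left position of the patch inside the atlas
    lod_x, lod_y    : LOD scale factors, combined with the ortho atlas
                      width/height scales in the full process
    """
    # Local patch coordinates (u*, v*): offset by the bounding box
    # minimum, then drop the projected axis.
    local = [xyz[i] - bb_min[i] for i in range(3) if i != projection_axis]
    u_star, v_star = local
    # Final atlas coordinates: scale by the LOD factors and add the
    # patch offsets inside the atlas.
    return (u0 + u_star * lod_x, v0 + v_star * lod_y)

# A vertex in a patch projected along z, with the patch placed at (16, 32):
uv = derive_uv((3.0, 5.0, 7.0), bb_min=(1.0, 1.0, 1.0),
               projection_axis=2, u0=16.0, v0=32.0)
```

Because the derivation uses only per-patch signaled values, no per-vertex (u, v) pairs need to be carried in the base mesh sub-bitstream, which is the point of ampi_atlas_mapping_method equal to 0.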
Reconstruction

For generating the connectivity associated with orthoAtlas projected patches, the following applies. Variables U0, V0, widthOccCC, heightOccCC, lodX, lodY, pID, and oIdx are assigned as follows:

occRes = PatchPackingBlockSize
U0 = TilePatch2dPosX[ tileID ][ p ] / occRes
V0 = TilePatch2dPosY[ tileID ][ p ] / occRes
widthOccCC = TilePatch2dSizeX[ tileID ][ p ] / occRes
heightOccCC = TilePatch2dSizeY[ tileID ][ p ] / occRes
lodX = TilePatchLoDScaleX[ tileID ][ p ] * ampi_ortho_atlas_width_scale
lodY = TilePatchLoDScaleY[ tileID ][ p ] * ampi_ortho_atlas_height_scale
pID = TilePatchProjectionID[ tileID ][ p ]
oIdx = TilePatchOrientationIndex[ tileID ][ p ]
width = asps_frame_width
height = asps_frame_height

Variables BBmin(x), BBmin(y), BBmin(z), BBmax(x), BBmax(y), and BBmax(z) are assigned as follows:

BBmin(x) = TilePatch3dBoundingBoxMin[ tileID ][ p ][ 0 ]
BBmin(y) = TilePatch3dBoundingBoxMin[ tileID ][ p ][ 1 ]
BBmin(z) = TilePatch3dBoundingBoxMin[ tileID ][ p ][ 2 ]
BBmax(x) = TilePatch3dBoundingBoxMax[ tileID ][ p ][ 0 ]
BBmax(y) = TilePatch3dBoundingBoxMax[ tileID ][ p ][ 1 ]
BBmax(z) = TilePatch3dBoundingBoxMax[ tileID ][ p ][ 2 ]

Then, for each sample with coordinates ( x, y, z ) that belongs to the reconstructed vertex positions, the conversion to the 2D (u, v) coordinate system is performed by first transforming ( x, y, z ) to a local patch coordinate pair ( u*, v* ), as given by Eq. (1). Then, the final (u, v) coordinate is obtained by applying the orientation matrices R( oIdx ) and O( oIdx ) of Table 1 together with the patch offsets. [The formulas of Eq. (1) and the final (u, v) formula are not reproduced in this text.]

Table 1
oIdx    R(oIdx)    O(oIdx)    Description
0    [not reproduced]    [not reproduced]    No transformation

Figure 4 illustrates a block diagram of an exemplary computing device configured to implement the mesh compression method according to some embodiments. The computing device 400 is able to be used to acquire, store, compute, process, communicate and/or display
information such as images and videos including 3D content. The computing device 400 is able to implement any of the encoding/decoding aspects. In general, a hardware structure suitable for implementing the computing device 400 includes a network interface 402, a memory 404, a processor 406, I/O device(s) 408, a bus 410 and a storage device 412. The choice of processor is not critical as long as a suitable processor with sufficient speed is chosen. The memory 404 is able to be any conventional computer memory known in the art. The storage device 412 is able to include a hard drive, CDROM, CDRW, DVD, DVDRW, High Definition disc/drive, ultra-HD drive, flash memory card or any other storage device. The computing device 400 is able to include one or more network interfaces 402. An example of a network interface includes a network card connected to an Ethernet or other type of LAN. The I/O device(s) 408 are able to include one or more of the following: keyboard, mouse, monitor, screen, printer, modem, touchscreen, button interface and other devices. Mesh compression application(s) 430 used to implement the mesh compression implementation are likely to be stored in the storage device 412 and memory 404 and processed as applications are typically processed. More or fewer components shown in Figure 4 are able to be included in the computing device 400. In some embodiments, mesh compression hardware 420 is included. Although the computing device 400 in Figure 4 includes applications 430 and hardware 420 for the mesh compression implementation, the mesh compression method is able to be implemented on a computing device in hardware, firmware, software or any combination thereof. For example, in some embodiments, the mesh compression applications 430 are programmed in a memory and executed using a processor.
In another example, in some embodiments, the mesh compression hardware 420 is programmed hardware logic including gates specifically designed to implement the mesh compression method.
In some embodiments, the mesh compression application(s) 430 include several applications and/or modules. In some embodiments, modules include one or more sub-modules as well. In some embodiments, fewer or additional modules are able to be included.
Examples of suitable computing devices include a personal computer, a laptop computer, a computer workstation, a server, a mainframe computer, a handheld computer, a personal digital assistant, a cellular/mobile telephone, a smart appliance, a gaming console, a digital camera, a digital camcorder, a camera phone, a smart phone, a portable music player, a tablet computer, a mobile device, a video player, a video disc writer/player (e.g., DVD writer/player, high definition disc writer/player, ultra high definition disc writer/player), a television, a home entertainment system, an augmented reality device, a virtual reality device, smart jewelry (e.g., smart watch), a vehicle (e.g., a self-driving vehicle) or any other suitable computing device.

To utilize the mesh compression method, a device acquires or receives 3D content (e.g., point cloud content). The mesh compression method is able to be implemented with user assistance or automatically without user involvement.

In operation, the mesh compression method provides the flexibility to signal a projection-based atlas parametrization method, e.g., "orthoAtlas," including: the description and use of V3C syntax structures and extensions to allow the derivation of (u, v) coordinates on the decoder side rather than carrying the (u, v) coordinates in the base mesh sub-bitstream as is currently practiced; keeping the V3C tile header syntax unchanged, with a one-to-one correspondence between a single tile and a single sub-mesh (in comparison to previous implementations, this keeps the V3C tile header syntax structure unchanged, and the signaling of the association between tiles and sub-meshes is performed as part of the afps extension); using V3C atlas tile partition information instead of its extension; and the use of new patch types similar to the ones introduced previously (e.g., I_MESH) but with minimal modifications to the original V3C patch data type syntax structure.
The mesh compression method separates and makes an explicit distinction between the geometry and displacement sub-bitstreams, in comparison with the approach taken previously, in which the geometry sub-bitstream was used to carry displacement information. As described herein, the displacement is carried in a vertex property sub-bitstream. Introducing such a separate sub-bitstream facilitates various vertex processing operations in the future (e.g., mesh tracking and vertex correspondence). Also, with this change, the geometry sub-bitstream remains semantically consistent with the current V3C geometry sub-bitstream, providing the potential for enhancement of the surface reconstruction process. The mesh compression method introduces a generic module for "connectivity" processing, with the "subdivision" method as one of its instances. Rather than using Draco-specific "attribute" terms, the generic term "property" is used to name all the data associated with a mesh/sub-mesh, as is done in the "ply" file format, for example. A number of "flags" are used to provide the option to use or bypass certain V-DMC methods and to consider the texture attribute map video as a V3C attribute component video. The mesh compression method enables signaling patchIds and facegroupIds in the mesh bitstream as an association between (patchId, facegroupId) pairs, allowing for a more generic, not Draco-specific, method of signaling.

SOME EMBODIMENTS OF V3C SYNTAX EXTENSION FOR MESH COMPRESSION

1. A method programmed in a non-transitory memory of a device comprising: implementing atlas mapping processing; performing connectivity processing to divide triangles to generate vertex information; and performing vertex position processing to adjust positions of the vertices in the vertex information.

2.
The method of clause 1 further comprising: implementing a displacement sub-bitstream to include displacement information, wherein the displacement information is utilized by the vertex position processing.

3. The method of clause 2 further comprising reconstructing a mesh based on the displaced vertices and the connectivity information.

4. The method of clause 1 wherein atlas mapping processing includes: receiving patch identification information and mapping function parameters; and generating (u,v) coordinates based on the patch identification and mapping function parameters.

5. The method of clause 1 further comprising utilizing one or more flags to bypass one or more V-DMC methods and to use a texture attribute map video as a V3C attribute component video.

6. The method of clause 1 wherein implementing atlas mapping processing occurs before performing connectivity processing and vertex position processing.

7. The method of clause 1 wherein implementing atlas mapping processing occurs after performing connectivity processing and vertex position processing.

8. An apparatus comprising: a non-transitory memory for storing an application, the application for: implementing atlas mapping processing; performing connectivity processing to divide triangles to generate vertex information; and performing vertex position processing to adjust positions of the vertices in the vertex information; and a processor coupled to the memory, the processor configured for processing the application.

9. The apparatus of clause 8 wherein the application is further for: implementing a displacement sub-bitstream to include displacement information, wherein the displacement information is utilized by the vertex position processing.

10. The apparatus of clause 9 wherein the application is further for reconstructing a mesh based on the displaced vertices and the connectivity information.

11.
The apparatus of clause 8 wherein atlas mapping processing includes: receiving patch identification information and mapping function parameters; and generating (u,v) coordinates based on the patch identification and mapping function parameters.

12. The apparatus of clause 8 wherein the application is further for utilizing one or more flags to bypass one or more V-DMC methods and to use a texture attribute map video as a V3C attribute component video.

13. The apparatus of clause 8 wherein implementing atlas mapping processing occurs before performing connectivity processing and vertex position processing.

14. The apparatus of clause 8 wherein implementing atlas mapping processing occurs after performing connectivity processing and vertex position processing.

15. A system comprising: an encoder configured for encoding a 3D mesh to generate patch identification information and mapping function parameters; and a decoder configured for: implementing atlas mapping processing on the patch identification information and the mapping function parameters; performing connectivity processing to divide triangles to generate vertex information; and performing vertex position processing to adjust positions of the vertices in the vertex information.

16. The system of clause 15 wherein the decoder is further for: implementing a displacement sub-bitstream to include displacement information, wherein the displacement information is utilized by the vertex position processing.

17. The system of clause 16 wherein the decoder is further for reconstructing a mesh based on the displaced vertices and the connectivity information.

18. The system of clause 15 wherein atlas mapping processing includes: receiving patch identification information and mapping function parameters; and generating (u,v) coordinates based on the patch identification and mapping function parameters.

19.
The system of clause 15 wherein the decoder is further for utilizing one or more flags to bypass one or more V-DMC methods and to use a texture attribute map video as a V3C attribute component video.

20. The system of clause 15 wherein implementing atlas mapping processing occurs before performing connectivity processing and vertex position processing.

21. The system of clause 15 wherein implementing atlas mapping processing occurs after performing connectivity processing and vertex position processing.

The present invention has been described in terms of specific embodiments incorporating details to facilitate the understanding of principles of construction and operation of the invention. Such reference herein to specific embodiments and details thereof is not intended to limit the scope of the claims appended hereto. It will be readily apparent to one skilled in the art that other various modifications may be made in the embodiment chosen for illustration without departing from the spirit and scope of the invention as defined by the claims.