ADR-0031 VR Designer preparation steps for Alembic Files from VStitcher to Unity – UC1

Publication Date: 0001-01-01
Last Update: 0001-01-01
Status: Accepted
References: ADR-0026 Use Browzwear parametric avatars for UC3

Context

For Use Case 1, the VR Designer app, a designer needs to create avatar+garment+animation Alembic files in Browzwear’s VStitcher. The exported Alembic files are then imported into Unity and embedded in the VR scene. Inside Unity, the Alembic files require some preparation before they can be used correctly in the scene. The process can be summarized as follows:

The designer (ODLO):

  1. Creates a 3D garment (pattern, stitch, materials, textures).
  2. Dresses it on an avatar (male/female).
  3. Inserts an animation the dressed avatar will perform. Browzwear has some animations built in, and animations from other sources (e.g. Mixamo) can be imported and used.
  4. Creates a simulation sequence with the avatar/garment and the selected animation.
  5. Exports the final result as a Browzwear Alembic file (.abc) together with all the textures.
  6. (Optional) Additional texture exports can be applied to an already exported .abc file without the need to create the whole Alembic export again.

The VR developer (CERTH):

  1. Creates a scene in Unity representing a fitting/designer room. The scene may change based on the designer’s suggestions (e.g. seasonal clothes).
  2. Building on the codebase at https://gitlab.com/etryon/uc1/vr-designer-app, imports the provided Alembic file into Unity and performs the following preparation steps:
    1. The Alembic file does not contain baked textures; VStitcher exports the textures (albedo, specular and normal maps) separately from the Alembic file. The textures therefore need to be applied correctly inside Unity.
    2. The Alembic file’s baked animation also needs to be set up and enabled inside Unity so that it plays automatically when the application runs (a sketch covering both of these steps follows this list).
  3. A UI is needed inside the application so that the user can navigate their options easily. Specifically, the UI needs to show all the available garment/avatar/animation options that the VR user can choose freely at runtime (a minimal UI sketch also follows this list). At the moment the above actions need to be set up manually, which slows down the preparation, but we are in the process of automating them.
  4. For the forthcoming pilots, we will use the above approach. In the meantime, we will continue to evolve the automation process.
  5. Future: the .abc files created by ODLO will be uploaded to the Firebase cloud. CERTH needs to connect Unity to the Firebase repository, fetch the existing files automatically inside Unity, and then continue with the above-mentioned steps (see the Firebase sketch below).
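
The texture and animation setup in steps 2.1 and 2.2 can be scripted rather than done by hand. Below is a minimal sketch, assuming the Unity Alembic package (com.unity.formats.alembic) with its AlembicStreamPlayer component, the built-in Standard (Specular setup) shader, and textures assigned in the Inspector. The class and field names are illustrative only, and property names such as CurrentTime and Duration may differ between Alembic package versions.

```csharp
using UnityEngine;
using UnityEngine.Formats.Alembic.Importer; // Unity Alembic package (com.unity.formats.alembic)

// Minimal sketch: applies the separately exported VStitcher textures to every
// renderer under the imported Alembic root and loops the baked animation.
// Texture references and shader choice are assumptions, not the project's actual setup.
public class AlembicGarmentSetup : MonoBehaviour
{
    [Header("Textures exported separately by VStitcher (assigned in the Inspector)")]
    public Texture2D albedo;
    public Texture2D normalMap;
    public Texture2D specularMap;

    AlembicStreamPlayer player;

    void Start()
    {
        // 1. Apply the textures to a material shared by all meshes of the avatar/garment.
        var material = new Material(Shader.Find("Standard (Specular setup)"));
        material.SetTexture("_MainTex", albedo);
        material.SetTexture("_BumpMap", normalMap);
        material.SetTexture("_SpecGlossMap", specularMap);
        material.EnableKeyword("_NORMALMAP");

        foreach (var renderer in GetComponentsInChildren<MeshRenderer>())
            renderer.sharedMaterial = material;

        // 2. Locate the stream player created by the Alembic importer.
        player = GetComponentInChildren<AlembicStreamPlayer>();
    }

    void Update()
    {
        // 3. Advance the baked simulation so it plays (and loops) automatically.
        if (player != null && player.Duration > 0f)
            player.CurrentTime = Mathf.Repeat(Time.time, player.Duration);
    }
}
```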
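
For the runtime UI in step 3, one possible approach (a sketch under assumptions, not the agreed design) is to preload the prepared avatar/garment/animation combinations as inactive GameObjects and let a Unity UI Dropdown activate the selected one. The class name and the variants list wired up in the Inspector are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: the prepared Alembic variants (avatar + garment + animation)
// are preloaded as inactive GameObjects; a dropdown lets the VR user switch between them.
public class GarmentSelectionMenu : MonoBehaviour
{
    public Dropdown dropdown;                 // Unity UI dropdown in the scene
    public List<GameObject> preparedVariants; // one root object per exported .abc setup

    void Start()
    {
        // Populate the dropdown with the names of the prepared variants.
        dropdown.ClearOptions();
        var labels = new List<string>();
        foreach (var variant in preparedVariants)
            labels.Add(variant.name);
        dropdown.AddOptions(labels);

        dropdown.onValueChanged.AddListener(Show);
        Show(0); // show the first option by default
    }

    void Show(int index)
    {
        // Activate only the selected avatar/garment/animation combination.
        for (int i = 0; i < preparedVariants.Count; i++)
            preparedVariants[i].SetActive(i == index);
    }
}
```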
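
For the future Firebase step (5), the sketch below assumes the Firebase Unity SDK’s Cloud Storage API (Firebase.Storage) and a hypothetical remote path; the actual repository layout and naming convention are still to be decided (see Consequences).

```csharp
using System.IO;
using Firebase.Extensions;   // ContinueWithOnMainThread
using Firebase.Storage;      // Firebase Unity SDK, Cloud Storage
using UnityEngine;

// Minimal sketch: downloads one .abc file from Firebase Cloud Storage into
// persistentDataPath. The remote path is a hypothetical placeholder.
public class AlembicDownloader : MonoBehaviour
{
    public string remotePath = "uc1/garments/example_garment.abc"; // hypothetical

    void Start()
    {
        var storage = FirebaseStorage.DefaultInstance;
        StorageReference fileRef = storage.GetReference(remotePath);

        string localPath = Path.Combine(Application.persistentDataPath,
                                        Path.GetFileName(remotePath));

        fileRef.GetFileAsync(localPath).ContinueWithOnMainThread(task =>
        {
            if (task.IsFaulted || task.IsCanceled)
            {
                Debug.LogError("Failed to download " + remotePath);
                return;
            }
            Debug.Log("Downloaded Alembic file to " + localPath);
            // The file can then be prepared with the steps described above.
        });
    }
}
```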

Decision

  1. We will use parametric avatars created by Browzwear instead of the scanned models used in the other use cases. Based on designer feedback, the garment matters much more than how the avatar looks, and since VStitcher provides very good avatar options, an externally scanned avatar is not required.

Consequences

  1. UC1 depends on ODLO and CERTH for its functionality. Initially, UC1 would have used the scanned avatars from Metail/QuantaCorp (ADR-0012 Metail Scanatar Creation), but since Browzwear provides sufficient avatar options, this UC won’t need them.
  2. We are still discussing exactly which options the UI inside the scene will provide and how the scene will look in general.
  3. When the .abc files are uploaded to the Firebase cloud, a specific format or naming convention for the files will be used so that Unity can automatically fetch and prepare the available garments (a hypothetical illustration follows this list).
  4. The admin/back-office CLI will no longer be used by ODLO, would add little value, and therefore won’t be developed.
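
As a purely hypothetical illustration of consequence 3 (no convention has been agreed yet), a file name could encode the garment, avatar and animation so that Unity can group the fetched .abc files automatically:

```csharp
// Hypothetical naming convention, e.g. "jacketA_female_walk.abc"
// (garment)_(avatar)_(animation).abc — nothing here is agreed yet.
public readonly struct GarmentFileInfo
{
    public readonly string Garment;
    public readonly string Avatar;
    public readonly string Animation;

    public GarmentFileInfo(string garment, string avatar, string animation)
    {
        Garment = garment; Avatar = avatar; Animation = animation;
    }

    public static bool TryParse(string fileName, out GarmentFileInfo info)
    {
        info = default;
        var parts = System.IO.Path.GetFileNameWithoutExtension(fileName).Split('_');
        if (parts.Length != 3) return false; // does not follow the assumed convention
        info = new GarmentFileInfo(parts[0], parts[1], parts[2]);
        return true;
    }
}
```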