How to visualize gesture (BVH) using Blender?
November 15, 2023
This is a video demo of using Blender to visualize 3D gestures.
You can download the files and plug-ins used in the video from Google Drive or Baidu Cloud.
You can visualize results on the BEAT dataset (e.g., from DiffuseStyleGesture+) by following the tutorial above.
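Before importing a generated BVH file into Blender, it can help to sanity-check what is inside it. The snippet below is a minimal inspector I am sketching here (it is not part of the tutorial or any of the linked tools): it lists the joint names and the motion metadata of a standard BVH file (HIERARCHY followed by MOTION).

```python
# Minimal BVH inspector (a sketch, not from the tutorial): reads joint names
# and motion metadata so you can sanity-check a file before importing it
# into Blender. Assumes a standard BVH layout (HIERARCHY + MOTION).

def inspect_bvh(text):
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    joints = []
    frames = 0
    frame_time = 0.0
    for ln in lines:
        if ln.startswith(("ROOT", "JOINT")):
            joints.append(ln.split()[1])          # joint name follows the keyword
        elif ln.startswith("Frames:"):
            frames = int(ln.split()[1])
        elif ln.startswith("Frame Time:"):
            frame_time = float(ln.split()[2])
    return {"joints": joints, "frames": frames, "frame_time": frame_time}

# Tiny hand-written example file: one bone, two frames.
SAMPLE = """HIERARCHY
ROOT Hips
{
  OFFSET 0.0 0.0 0.0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  End Site
  {
    OFFSET 0.0 10.0 0.0
  }
}
MOTION
Frames: 2
Frame Time: 0.033333
0 0 0 0 0 0
0 1 0 0 0 0
"""

info = inspect_bvh(SAMPLE)
print(info["joints"], info["frames"], info["frame_time"])  # ['Hips'] 2 0.033333
```

Inside Blender itself you would normally just use the built-in importer (File > Import > Motion Capture (.bvh)), as shown in the video.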
If you want to visualize the results of the ZEGGS dataset (i.e., DiffuseStyleGesture), you will need to perform an additional step, because the initial pose of the ZEGGS skeleton is not a T-pose, as shown at approximately 13:17 in the video. [See this issue]
- Use the rendering code provided by Ubisoft: after you run the BVH2FBX code it provides, you get a new FBX file. Import it into Blender with the Better FBX Importer plugin (provided in the link above), and then continue with the skeleton retargeting as in the tutorial. I would also suggest downloading and installing MotionBuilder 2020, as it gives new users a month of free use.
- Alternatively, use Maya to manually set the initial pose of the generated result to a T-pose. This method is more troublesome and difficult; I may record another tutorial for it sometime later, so I recommend the first method.
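To tell quickly whether a given BVH file will hit this problem, one rough heuristic (my own sketch, not part of either workflow above) is to check whether the rotation channels of the first motion frame are all near zero, i.e. whether the clip starts from the file's rest pose. If they are not, retargeting steps that expect a T-pose start will likely misbehave, as with ZEGGS.

```python
# Rough heuristic (a sketch, not part of the official workflow): does the
# first motion frame of a BVH file keep all rotation channels near zero?
# If not, the clip starts away from the file's rest pose, which is the
# situation that breaks naive retargeting for ZEGGS results.

def starts_at_rest_pose(text, tol=1e-3):
    channel_names = []
    first_frame = None
    in_motion = False
    for ln in text.splitlines():
        s = ln.strip()
        if s.startswith("CHANNELS"):
            channel_names.extend(s.split()[2:])   # e.g. Xposition, Zrotation, ...
        elif s == "MOTION":
            in_motion = True
        elif in_motion and first_frame is None and s and s[0] in "-0123456789":
            first_frame = [float(v) for v in s.split()]
    if first_frame is None:
        raise ValueError("no motion data found")
    rotations = [v for name, v in zip(channel_names, first_frame)
                 if name.endswith("rotation")]
    return all(abs(v) < tol for v in rotations)

# Two tiny hand-written files: one starting at the rest pose, one not.
HEADER = """HIERARCHY
ROOT Hips
{
  OFFSET 0 0 0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  End Site
  {
    OFFSET 0 10 0
  }
}
MOTION
Frames: 1
Frame Time: 0.0333
"""
REST_START = HEADER + "0 90 0 0 0 0\n"       # rotations all zero
BENT_START = HEADER + "0 90 0 25 0 0\n"      # nonzero Zrotation on frame 1
```

Note this only checks the first frame against the file's own rest pose; whether that rest pose is actually a T-pose is defined by the OFFSET hierarchy, which is what the BVH2FBX / Maya steps above are fixing.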