
@mikelyndon
Created March 15, 2023 16:53
Pseudo code and notes for generating vertex animation textures for deforming geometry with a fixed point count.

Vertex Animation Textures

Prerequisites

  • Normals
  • UVs

Create orient attribute (allows us to update normals in the vertex shader)

  • Create normal (N) attribute if missing
  • Create tangentu attribute (preferably MikkT)
  • Create matrix3 from N and tangentu
  • Convert matrix3 to orient attribute (quaternion)
// Point wrangle (VEX): build an orthonormal frame and store it as a quaternion
v@N = normalize(v@N);
vector tangentu = length(v@tangentu) > 0.01 ? normalize(v@tangentu) : {1, 0, 0};

// maketransform(zaxis, yaxis) builds a matrix3 from the two axes
matrix3 m = maketransform(normalize(cross(tangentu, v@N)), v@N);
p@orient = normalize(quaternion(m));
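On the decode side, the vertex shader rotates the rest-pose normal (and tangent) by this quaternion. A minimal sketch of that rotation, written here in Python rather than shader code, assuming the quaternion is stored as (x, y, z, w):

import numpy as np

def quat_rotate(q, v):
    # Rotate vector v by the unit quaternion q = (x, y, z, w).
    x, y, z, w = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

# e.g. animated_normal = quat_rotate(orient_from_texture, rest_normal)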

Calculate position delta

  • For each point calculate the vector delta from the first frame to the current frame (a sketch follows this list).
vector delta = P_current_frame[i] - P_first_frame[i];
  • If no colour attribute exists, add a default vertex colour of (1, 1, 1).
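A minimal sketch of that delta calculation over the whole sequence, assuming each frame's point positions are available as a numpy array of shape (num_points, 3) in a consistent point order (array and function names are illustrative):

import numpy as np

def position_deltas(frames):
    """frames: list of (num_points, 3) position arrays, one per frame."""
    rest = frames[0]
    # One (num_points, 3) delta array per frame, relative to the first frame.
    return [frame_positions - rest for frame_positions in frames]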

Create a copy of the geometry for every frame in the animation.

  • This is just a way to access attributes for each frame when generating the texture. This could also be done as a loop when writing the values to the texture.
  • It can also help with the next step of the process.

Calculate the min and max bounds of the position deltas across the entire animation (these are used to normalize the texture values later).
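One way to get those bounds, continuing the numpy sketch above (per-axis bounds shown; a single scalar min/max pair also works as long as the shader uses the same convention):

import numpy as np

def global_bounds(deltas):
    """deltas: list of (num_points, 3) delta arrays, one per frame."""
    all_values = np.concatenate(deltas, axis=0)
    return all_values.min(axis=0), all_values.max(axis=0)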

Mesh Preparation

  • Scale to match target platform (x100)
  • Triangulate mesh (more consistent than trusting the target platform)
  • Create uv2
    • Get max texture width (eg. 1024)
    • Get total frame count (end frame - start frame + 1)
    // Point wrangle (VEX): pack each point into its own texel column
    float max_width = 1024;            // max texture width for the target platform
    float numpt = npoints(0);          // total number of points
    float frame_count = 100;           // end frame - start frame + 1

    float width = min(max_width, numpt);
    float rows = ceil(numpt / max_width);

    float unit_size = 1.0 / width;
    float unit_size_y = 1.0 / rows / frame_count;

    float i = float(@ptnum);

    v@uv2.x = (0.5 + i - floor(i / width) * width) * unit_size;
    v@uv2.y = 1.0 - (floor(i / width) * unit_size_y) - 0.5 * unit_size_y;
  • Add two triangles to the mesh. One at the min and one at the max bound value.
    • This helps us normalize values when exporting the texture and then expand them back to world space in the shader (see the sketch after this list).
    • You might be tempted to add a single vertex but this can be problematic with some platforms that will automatically cull vertices that aren't associated with a triangle.
  • Add tangentv attribute to geometry
vector tangentv = normalize(cross(v@N, v@tangentu));

// Flip the binormal if the target platform expects the opposite handedness
if (chi("binormal_mode") == 1) {
  tangentv *= -1.0;
}
v@tangentv = tangentv;
  • Export mesh as gltf/fbx
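A rough sketch of those two extra triangles (mentioned above), assuming mesh points are handled as plain numpy arrays at export time; the function and the size value are illustrative, not any particular exporter's API:

import numpy as np

def bounds_triangle(corner, size=1e-3):
    # A very small triangle anchored at `corner`, so this position is part of the
    # exported mesh (and its bounding box) but stays visually negligible.
    corner = np.asarray(corner, dtype=np.float32)
    return np.stack([corner,
                     corner + np.array([size, 0.0, 0.0], dtype=np.float32),
                     corner + np.array([0.0, size, 0.0], dtype=np.float32)])

# Append bounds_triangle(bound_min) and bounds_triangle(bound_max) to the mesh
# before export.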

Generate position texture

  • Dimensions = x: number of points in geo, y: number of frames in anim
  • For each pixel in x, get the P value of the corresponding point; for each pixel in y, get that point's P value at the corresponding frame (a fuller sketch follows this list).
for x in range(texture_width):
    for y in range(total_frames):
        # x is the point index, y is the frame index
        pixel[x, y] = getP_atFrame(point=x, frame=y)
  • You might need to swap components or flip coords depending on the handedness of the 3D software and the target platform.
    • eg. swap y with z, negate x
  • Set alpha to 1 (white). Some platforms will premultiply alpha regardless of settings.
    • You could also use alpha to store pscale.
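Putting the earlier numpy sketches together, a rough version of this step (single row of points per frame, i.e. ignoring the multi-row layout needed when the point count exceeds the max texture width; writing the array out as an EXR/float image is left to whichever image library you use):

import numpy as np

def build_position_texture(deltas, bound_min, bound_max):
    """deltas: list of (num_points, 3) delta arrays, one per frame (see earlier steps)."""
    num_frames = len(deltas)
    num_points = deltas[0].shape[0]
    # One row per frame, one column per point, RGBA with alpha preset to 1.
    tex = np.ones((num_frames, num_points, 4), dtype=np.float32)

    for frame, delta in enumerate(deltas):
        # Remap each delta into 0..1 using the animation-wide bounds.
        tex[frame, :, :3] = (delta - bound_min) / (bound_max - bound_min)

    # Swap/flip components here if the target platform's handedness differs,
    # and flip the rows if it samples V the other way up.
    return tex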

Generate orient texture

  • Same approach as position texture except swapping and flipping the quaternion components can be a bit tricky.
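The same sketch adapted for the orient texture, assuming each frame's quaternions are available as (num_points, 4) arrays in (x, y, z, w) order; here w lands in the alpha channel, and any component swaps or flips are left out since, as noted above, they depend on the platform:

import numpy as np

def build_orient_texture(orients):
    """orients: list of (num_points, 4) unit-quaternion arrays, one per frame."""
    num_frames = len(orients)
    num_points = orients[0].shape[0]
    tex = np.empty((num_frames, num_points, 4), dtype=np.float32)

    for frame, quats in enumerate(orients):
        # Quaternion components are in -1..1, so remap them into 0..1 for storage.
        tex[frame] = quats * 0.5 + 0.5

    return tex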
@donmccurdy

@mikelyndon do I understand correctly that the position texture stores a delta from the first frame, rather than an absolute floating-point position? I see the delta calculated but not sure where that is being used. Is that also true for the orient texture?

@mikelyndon
Author

The position texture stores the delta because we export a static mesh at the first frame of the sequence; the texture is then used to offset each position by its delta. You could store absolute positions, but deltas generally have fewer precision issues.
Orient is just the normal's orientation at the current frame.
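In other words, the shader-side reconstruction, shown here as plain Python for illustration rather than shader code (names are illustrative), boils down to:

def reconstruct_position(p_static, texel_rgb, bound_min, bound_max):
    # Undo the 0..1 normalization, then add the delta to the static mesh position.
    delta = bound_min + texel_rgb * (bound_max - bound_min)
    return p_static + delta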
